Mar 18 13:02:30 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 13:02:30 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:02:30 crc restorecon[4687]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc 
restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc 
restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc 
restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc 
restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 
crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc 
restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:30 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 
crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc 
restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc 
restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 13:02:31 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 13:02:31 crc kubenswrapper[4912]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 13:02:31 crc kubenswrapper[4912]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 13:02:31 crc kubenswrapper[4912]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 13:02:31 crc kubenswrapper[4912]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 13:02:31 crc kubenswrapper[4912]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 13:02:31 crc kubenswrapper[4912]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.973976 4912 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979261 4912 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979290 4912 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979297 4912 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979304 4912 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979310 4912 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979317 4912 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979322 4912 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979328 4912 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979333 4912 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979338 4912 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979343 4912 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979348 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979353 4912 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979358 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979362 4912 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979367 4912 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979372 4912 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979377 4912 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979382 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979387 4912 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979392 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979397 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979402 4912 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979416 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:02:31 crc 
kubenswrapper[4912]: W0318 13:02:31.979421 4912 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979426 4912 feature_gate.go:330] unrecognized feature gate: Example Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979431 4912 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979436 4912 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979442 4912 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979446 4912 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979451 4912 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979457 4912 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979461 4912 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979466 4912 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979471 4912 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979475 4912 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979480 4912 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979485 4912 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979490 4912 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy 
Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979495 4912 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979500 4912 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979505 4912 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979510 4912 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979515 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979521 4912 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979527 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979533 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979540 4912 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979546 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979552 4912 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979557 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979562 4912 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979568 4912 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979575 4912 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979580 4912 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979586 4912 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979591 4912 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979597 4912 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979603 4912 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979609 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979615 4912 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979622 4912 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979628 4912 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979634 4912 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979639 4912 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979644 4912 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979648 4912 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979654 4912 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979660 4912 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979665 4912 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.979670 4912 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980783 4912 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980805 
4912 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980818 4912 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980826 4912 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980835 4912 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980841 4912 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980850 4912 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980857 4912 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980864 4912 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980871 4912 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980877 4912 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980883 4912 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980889 4912 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980895 4912 flags.go:64] FLAG: --cgroup-root="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980901 4912 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980907 4912 flags.go:64] FLAG: --client-ca-file="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980912 4912 flags.go:64] FLAG: --cloud-config="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980918 4912 flags.go:64] FLAG: --cloud-provider="" 
Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980923 4912 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980930 4912 flags.go:64] FLAG: --cluster-domain="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980936 4912 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980941 4912 flags.go:64] FLAG: --config-dir="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980947 4912 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980954 4912 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980962 4912 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980967 4912 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980974 4912 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980980 4912 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980986 4912 flags.go:64] FLAG: --contention-profiling="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980992 4912 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.980998 4912 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981003 4912 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981009 4912 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981017 4912 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981024 4912 
flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981031 4912 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981070 4912 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981078 4912 flags.go:64] FLAG: --enable-server="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981085 4912 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981096 4912 flags.go:64] FLAG: --event-burst="100" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981105 4912 flags.go:64] FLAG: --event-qps="50" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981111 4912 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981118 4912 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981125 4912 flags.go:64] FLAG: --eviction-hard="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981133 4912 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981139 4912 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981145 4912 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981152 4912 flags.go:64] FLAG: --eviction-soft="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981158 4912 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981163 4912 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981169 4912 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 13:02:31 crc kubenswrapper[4912]: 
I0318 13:02:31.981175 4912 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981182 4912 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981188 4912 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981193 4912 flags.go:64] FLAG: --feature-gates="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981201 4912 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981206 4912 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981212 4912 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981218 4912 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981224 4912 flags.go:64] FLAG: --healthz-port="10248" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981230 4912 flags.go:64] FLAG: --help="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981236 4912 flags.go:64] FLAG: --hostname-override="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981242 4912 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981248 4912 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981253 4912 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981259 4912 flags.go:64] FLAG: --image-credential-provider-config="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981265 4912 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981271 4912 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981277 4912 
flags.go:64] FLAG: --image-service-endpoint="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981284 4912 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981290 4912 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981296 4912 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981302 4912 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981308 4912 flags.go:64] FLAG: --kube-reserved="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981313 4912 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981319 4912 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981326 4912 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981332 4912 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981337 4912 flags.go:64] FLAG: --lock-file="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981343 4912 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981349 4912 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981356 4912 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981364 4912 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981370 4912 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981376 4912 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981381 4912 
flags.go:64] FLAG: --logging-format="text" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981387 4912 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981394 4912 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981399 4912 flags.go:64] FLAG: --manifest-url="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981405 4912 flags.go:64] FLAG: --manifest-url-header="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981413 4912 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981419 4912 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981426 4912 flags.go:64] FLAG: --max-pods="110" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981432 4912 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981438 4912 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981444 4912 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981449 4912 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981455 4912 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981461 4912 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981468 4912 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981483 4912 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981489 4912 
flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981495 4912 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981501 4912 flags.go:64] FLAG: --pod-cidr="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981507 4912 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981518 4912 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981523 4912 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981530 4912 flags.go:64] FLAG: --pods-per-core="0" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981535 4912 flags.go:64] FLAG: --port="10250" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981542 4912 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981547 4912 flags.go:64] FLAG: --provider-id="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981553 4912 flags.go:64] FLAG: --qos-reserved="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981559 4912 flags.go:64] FLAG: --read-only-port="10255" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981565 4912 flags.go:64] FLAG: --register-node="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981571 4912 flags.go:64] FLAG: --register-schedulable="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981577 4912 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981587 4912 flags.go:64] FLAG: --registry-burst="10" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981593 4912 flags.go:64] FLAG: --registry-qps="5" Mar 18 13:02:31 crc 
kubenswrapper[4912]: I0318 13:02:31.981599 4912 flags.go:64] FLAG: --reserved-cpus="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981604 4912 flags.go:64] FLAG: --reserved-memory="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981611 4912 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981617 4912 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981623 4912 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981629 4912 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981635 4912 flags.go:64] FLAG: --runonce="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981641 4912 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981647 4912 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981653 4912 flags.go:64] FLAG: --seccomp-default="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981659 4912 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981664 4912 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981671 4912 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981677 4912 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981684 4912 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981690 4912 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981696 4912 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 
13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981702 4912 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981708 4912 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981715 4912 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981721 4912 flags.go:64] FLAG: --system-cgroups="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981727 4912 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981736 4912 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981742 4912 flags.go:64] FLAG: --tls-cert-file="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981747 4912 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981754 4912 flags.go:64] FLAG: --tls-min-version="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981759 4912 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981765 4912 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981771 4912 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981777 4912 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981784 4912 flags.go:64] FLAG: --v="2" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981793 4912 flags.go:64] FLAG: --version="false" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981801 4912 flags.go:64] FLAG: --vmodule="" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981808 4912 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 
13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.981814 4912 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.981999 4912 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982006 4912 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982012 4912 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982017 4912 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982023 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982029 4912 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982037 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982063 4912 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982069 4912 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982074 4912 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982079 4912 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982084 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982089 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 
13:02:31.982094 4912 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982099 4912 feature_gate.go:330] unrecognized feature gate: Example Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982105 4912 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982111 4912 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982117 4912 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982122 4912 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982126 4912 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982132 4912 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982137 4912 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982142 4912 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982147 4912 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982151 4912 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982156 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982161 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982166 4912 feature_gate.go:330] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982171 4912 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982175 4912 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982180 4912 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982186 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982191 4912 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982198 4912 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982204 4912 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982209 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982214 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982219 4912 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982229 4912 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982234 4912 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982238 4912 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982243 4912 feature_gate.go:330] unrecognized feature 
gate: AWSClusterHostedDNS Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982250 4912 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982262 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982267 4912 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982272 4912 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982277 4912 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982282 4912 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982287 4912 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982292 4912 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982297 4912 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982302 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982307 4912 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982312 4912 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982317 4912 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982323 4912 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982329 4912 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982335 4912 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982340 4912 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982346 4912 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982351 4912 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982356 4912 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982360 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982365 4912 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982370 4912 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982375 4912 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982380 4912 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982386 4912 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982391 4912 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:02:31 crc kubenswrapper[4912]: W0318 13:02:31.982396 4912 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:02:31 crc 
kubenswrapper[4912]: W0318 13:02:31.982403 4912 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.983016 4912 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.994308 4912 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 18 13:02:31 crc kubenswrapper[4912]: I0318 13:02:31.994340 4912 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994446 4912 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994455 4912 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994461 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994467 4912 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994472 4912 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994477 4912 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994485 4912 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994493 4912 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994499 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994505 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994511 4912 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994516 4912 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994521 4912 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994526 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994531 4912 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994537 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994542 4912 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994547 4912 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994552 4912 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994557 4912 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994562 4912 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 13:02:32 crc kubenswrapper[4912]: 
W0318 13:02:31.994567 4912 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994572 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994577 4912 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994583 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994588 4912 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994593 4912 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994598 4912 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994605 4912 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994611 4912 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994619 4912 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994625 4912 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994631 4912 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994636 4912 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994642 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994647 4912 feature_gate.go:330] unrecognized feature gate: Example Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994652 4912 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994657 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994662 4912 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994667 4912 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994672 4912 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994677 4912 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994682 4912 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994687 4912 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994692 4912 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994697 4912 feature_gate.go:330] unrecognized 
feature gate: PrivateHostedZoneAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994702 4912 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994707 4912 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994713 4912 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994718 4912 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994723 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994728 4912 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994735 4912 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994740 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994746 4912 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994751 4912 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994756 4912 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994761 4912 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994766 4912 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994771 4912 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994776 4912 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994781 4912 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994787 4912 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994791 4912 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994796 4912 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994801 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994806 4912 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994811 4912 feature_gate.go:330] 
unrecognized feature gate: VSphereMultiVCenters Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994816 4912 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994820 4912 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994826 4912 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:31.994835 4912 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.994986 4912 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995001 4912 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995008 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995014 4912 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995021 4912 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995028 4912 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995061 4912 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995069 4912 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995076 4912 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995083 4912 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995088 4912 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995094 4912 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995101 4912 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995107 4912 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995113 4912 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995120 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995126 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995130 4912 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995135 4912 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995140 4912 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 
13:02:31.995145 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995150 4912 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995155 4912 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995160 4912 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995165 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995170 4912 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995175 4912 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995180 4912 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995187 4912 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995193 4912 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995198 4912 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995203 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995208 4912 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995213 4912 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995219 4912 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995224 4912 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995229 4912 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995234 4912 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995239 4912 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995246 4912 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995251 4912 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995257 4912 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995263 4912 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995268 4912 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995273 4912 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995279 4912 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995284 4912 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995289 4912 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995295 4912 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995300 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995305 4912 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995310 4912 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995315 4912 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995320 4912 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 
13:02:31.995325 4912 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995330 4912 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995335 4912 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995339 4912 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995344 4912 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995349 4912 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995354 4912 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995359 4912 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995364 4912 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995368 4912 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995373 4912 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995378 4912 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995383 4912 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995388 4912 feature_gate.go:330] unrecognized feature gate: Example Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995393 4912 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:02:32 crc 
kubenswrapper[4912]: W0318 13:02:31.995398 4912 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:31.995403 4912 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:31.995411 4912 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:31.996328 4912 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.001496 4912 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.004857 4912 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.004955 4912 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.007208 4912 server.go:997] "Starting client certificate rotation" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.007236 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.007485 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.035019 4912 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.037743 4912 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.038483 4912 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.050328 4912 log.go:25] "Validated CRI v1 runtime API" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.091411 4912 log.go:25] "Validated CRI v1 image API" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.093973 4912 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.099393 4912 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-12-57-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.099492 4912 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.129673 4912 manager.go:217] Machine: {Timestamp:2026-03-18 13:02:32.125910879 +0000 UTC m=+0.585338404 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e10d31e1-6845-4aa5-a90d-99ca9bbe0732 BootID:b67b4615-c409-4182-8457-37817034d738 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:6d:75:82 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:6d:75:82 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:10:53:7d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:22:11:27 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f5:5e:f2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:9b:52:65 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:d6:07:d3:5e:8a:c9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5e:45:7e:00:0c:23 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.130397 4912 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.130737 4912 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.131347 4912 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.131934 4912 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.132019 4912 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.132532 4912 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.132559 4912 container_manager_linux.go:303] "Creating device plugin manager"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.133183 4912 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.133256 4912 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.135488 4912 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.135666 4912 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.139887 4912 kubelet.go:418] "Attempting to sync node with API server"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.139939 4912 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.140066 4912 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.140099 4912 kubelet.go:324] "Adding apiserver pod source"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.140122 4912 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.145104 4912 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.146603 4912 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:32.146633 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.146741 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:32.146685 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.146849 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.148243 4912 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150157 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150203 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150246 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150275 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150309 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150329 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150353 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150379 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150396 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150412 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150467 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.150482 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.151413 4912 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.152358 4912 server.go:1280] "Started kubelet"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.155484 4912 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.155507 4912 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 13:02:32 crc systemd[1]: Started Kubernetes Kubelet.
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.158950 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.160884 4912 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.168822 4912 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.168887 4912 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.169219 4912 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.169289 4912 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.169320 4912 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.169223 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:32.170117 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.170225 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.171084 4912 factory.go:55] Registering systemd factory
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.171119 4912 factory.go:221] Registration of the systemd container factory successfully
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.171545 4912 factory.go:153] Registering CRI-O factory
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.171589 4912 factory.go:221] Registration of the crio container factory successfully
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.171763 4912 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.171832 4912 factory.go:103] Registering Raw factory
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.171885 4912 manager.go:1196] Started watching for new ooms in manager
Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.172464 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms"
Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.170149 4912 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189df120f7c9e46e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,LastTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.174264 4912 server.go:460] "Adding debug handlers to kubelet server"
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.175538 4912 manager.go:319] Starting recovery of all containers
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.176820 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177081 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177093 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177103 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177113 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177123 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177132 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177141 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177152 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177161 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177169 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177177 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177186 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177198 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177207 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177217 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177225 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177234 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177246 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177257 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177575 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177585 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177633 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177651 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177660 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177668 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177735 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177744 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177754 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177764 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177794 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177804 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177812 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177821 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177830 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177926 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177937 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177946 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177975 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177984 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.177995 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178004 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178014 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178024 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178047 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178057 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178085 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178125 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178134 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178143 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178152 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178162 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178208 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178220 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178298 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178310 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178324 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178336 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178371 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178384 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178397 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178409 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178443 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178453 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178483 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178494 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178503 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178531 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178585 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178595 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178623 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178698 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178711 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178719 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178727 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178737 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178785 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178794 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178851
4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178865 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178874 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178916 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178926 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178936 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178962 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178971 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.178996 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179012 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179020 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179031 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179177 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179187 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179195 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179204 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179232 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179242 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179292 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179302 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179311 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179319 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179333 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179342 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179378 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179389 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179473 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179485 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179511 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179569 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179579 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" 
seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179588 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179631 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179642 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179652 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179661 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179669 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179678 4912 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179707 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179766 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179813 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179825 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179833 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179841 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179868 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179876 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179917 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179926 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.179986 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180000 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180008 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180016 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180024 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180047 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180056 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180066 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180094 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180102 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180110 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180118 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180163 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180172 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180180 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180189 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180232 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180274 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180283 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180292 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180300 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180308 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180340 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180389 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180422 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180436 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180486 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180497 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180522 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180531 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180541 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180586 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180655 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180669 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180677 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180686 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180695 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180708 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 
13:02:32.180718 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180767 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180793 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180801 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180810 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180818 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180827 4912 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180837 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180846 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180893 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180935 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180946 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180955 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180964 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.180972 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181017 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181133 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181143 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181167 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181175 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181188 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181243 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181268 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181277 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181285 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181293 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181336 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.181348 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.183579 4912 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.183607 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.183619 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.184691 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.184757 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.184794 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.184815 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.184840 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.184919 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.184952 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185059 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185084 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185106 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185187 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185209 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185244 4912 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185269 4912 reconstruct.go:97] "Volume reconstruction finished" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.185292 4912 reconciler.go:26] "Reconciler: start to sync state" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.215534 4912 manager.go:324] Recovery completed Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.223986 4912 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.226413 4912 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.226508 4912 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.226563 4912 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.226766 4912 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:32.228439 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.229141 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.234963 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.236529 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.236570 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.236582 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.237390 4912 
cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.237412 4912 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.237437 4912 state_mem.go:36] "Initialized new in-memory state store" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.254426 4912 policy_none.go:49] "None policy: Start" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.255817 4912 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.255879 4912 state_mem.go:35] "Initializing new in-memory state store" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.270259 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.308651 4912 manager.go:334] "Starting Device Plugin manager" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.308890 4912 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.308914 4912 server.go:79] "Starting device plugin registration server" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.309552 4912 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.309578 4912 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.309976 4912 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.310104 4912 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.310117 4912 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.319351 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.327597 4912 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.327728 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.330374 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.330417 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.330430 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.330619 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.332209 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.332350 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.332391 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.332429 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.332463 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.332939 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.333008 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.333059 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.334844 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.334877 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.334891 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335090 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335127 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 
13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335192 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335304 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335323 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335333 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335490 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335611 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.335643 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.337097 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.337176 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.337196 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.337215 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.337263 4912 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.337289 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.338065 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.338106 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.338115 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343101 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343139 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343160 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343459 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343486 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343500 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343822 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.343866 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.347112 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.347174 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.347189 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.374212 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388473 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388607 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388659 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388684 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388754 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388781 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388831 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388861 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388908 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388929 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.388953 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.389121 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.389176 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.389202 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.389244 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.409989 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.411414 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.411460 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.411498 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.411530 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.412144 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: 
connection refused" node="crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491354 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491453 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491491 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491580 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491618 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491618 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491653 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491745 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491648 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491824 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491853 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491887 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491916 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491946 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491945 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491746 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492088 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491985 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491718 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492234 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492016 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492345 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491986 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492406 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.491694 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492443 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492107 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492483 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492488 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.492628 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.613174 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.615009 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.615111 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.615131 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.615173 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.615704 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.666589 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.673820 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.691156 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.691418 4912 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189df120f7c9e46e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,LastTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.706784 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: I0318 13:02:32.713618 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:32.741131 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-32189bfb395d259c7bb73433087fd21ea56af844fb61cd57414210dc1c830e78 WatchSource:0}: Error finding container 32189bfb395d259c7bb73433087fd21ea56af844fb61cd57414210dc1c830e78: Status 404 returned error can't find the container with id 32189bfb395d259c7bb73433087fd21ea56af844fb61cd57414210dc1c830e78 Mar 18 13:02:32 crc kubenswrapper[4912]: W0318 13:02:32.747110 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-33b06db5904e182063359b9f75b360c44f71a422cbe05a14fcd58c5f0e9ecf0f WatchSource:0}: Error finding container 33b06db5904e182063359b9f75b360c44f71a422cbe05a14fcd58c5f0e9ecf0f: Status 404 returned error can't find the container with id 33b06db5904e182063359b9f75b360c44f71a422cbe05a14fcd58c5f0e9ecf0f Mar 18 13:02:32 crc kubenswrapper[4912]: E0318 13:02:32.775581 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.016136 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.017621 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.017677 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.017690 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.017720 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:02:33 crc kubenswrapper[4912]: E0318 13:02:33.018285 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.160625 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.232483 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"48851281ca3878ff492dcce2bb98dda0eb16967f1f3949ed94b9f7f5482ffd05"} Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.233220 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"32189bfb395d259c7bb73433087fd21ea56af844fb61cd57414210dc1c830e78"} Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.233998 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"446eab7499c5c20696a57af603dee26290b1822e573c5b33d02f3d2859f2f121"} Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.234720 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fe4e61d597c877d09f49853ca08ba8f66cb665e8ad2dfb9436efe59caafb9ffe"} Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.235383 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"33b06db5904e182063359b9f75b360c44f71a422cbe05a14fcd58c5f0e9ecf0f"} Mar 18 13:02:33 crc kubenswrapper[4912]: W0318 13:02:33.431190 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:33 crc kubenswrapper[4912]: E0318 13:02:33.431771 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:02:33 crc kubenswrapper[4912]: W0318 13:02:33.487831 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:33 crc kubenswrapper[4912]: E0318 13:02:33.488095 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 
13:02:33 crc kubenswrapper[4912]: E0318 13:02:33.577350 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Mar 18 13:02:33 crc kubenswrapper[4912]: W0318 13:02:33.682931 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:33 crc kubenswrapper[4912]: E0318 13:02:33.683094 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:02:33 crc kubenswrapper[4912]: W0318 13:02:33.766664 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:33 crc kubenswrapper[4912]: E0318 13:02:33.766771 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.819090 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:33 crc 
kubenswrapper[4912]: I0318 13:02:33.820257 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.820287 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.820295 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:33 crc kubenswrapper[4912]: I0318 13:02:33.820318 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:02:33 crc kubenswrapper[4912]: E0318 13:02:33.820706 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.160524 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.234971 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 13:02:34 crc kubenswrapper[4912]: E0318 13:02:34.236221 4912 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.246655 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8626806a0666bb59873d0ebbd6f2fb17844b96e435d991fdbc008ab81421ce2b"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.246714 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"44d9e9791f7de2a510a1884706a874d1720878d144899d8381590e2c70e67275"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.246756 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eb89ce0333288e4b2c7ce5ce243c1beacd93814565f95b08c90f42612958dc65"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.246768 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"033c8d97bacf98c7ec2e36fad49fb41b161b92e3d0ea907012c00b4248974787"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.246839 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.248693 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.248950 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.248963 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.249520 4912 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad" exitCode=0 Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.249689 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.249669 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.251056 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.251153 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.251170 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.252353 4912 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6437e255924d19a4445de729bcbe9d72963444c10b4190451a8db12b324ec746" exitCode=0 Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.252452 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6437e255924d19a4445de729bcbe9d72963444c10b4190451a8db12b324ec746"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.252549 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.254635 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.254689 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.254701 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.257991 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.259095 4912 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d8499f8c5d2a81c06e6106f6ab56bf17183f54220d435bdeae0ca58252669842" exitCode=0 Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.259231 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d8499f8c5d2a81c06e6106f6ab56bf17183f54220d435bdeae0ca58252669842"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.259318 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.260362 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.260398 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.260413 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.261431 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:34 crc 
kubenswrapper[4912]: I0318 13:02:34.261459 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.261474 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.263422 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7c787418ab2fb3ccbc993b70080c81608ad93fd02b7ef930e5d1466666269147"} Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.263557 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.263393 4912 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7c787418ab2fb3ccbc993b70080c81608ad93fd02b7ef930e5d1466666269147" exitCode=0 Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.264597 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.264625 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:34 crc kubenswrapper[4912]: I0318 13:02:34.264637 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.161163 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:35 crc kubenswrapper[4912]: E0318 13:02:35.180435 4912 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="3.2s" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.269523 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.269532 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1ee1531d31c09d1cc289343474f020aa56fd1e720b21af9d08737b3a53ddeef1"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.269668 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"95ecb4f9aedeef5fd2eea24dc745937c722f7573adef4680287c8189f3537acd"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.269700 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5616d0ea30c996f957e61d70fee5cf58da3f907eb3a39b8a61dd9d1a43e206e9"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.270261 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.270290 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.270299 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.271589 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.271619 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.271632 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.271642 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.272893 4912 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b1a1a5d33648c44d5c9b46c881c2f46422754dc88864b7f442aa9f8b4cd19cd8" exitCode=0 Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.272936 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b1a1a5d33648c44d5c9b46c881c2f46422754dc88864b7f442aa9f8b4cd19cd8"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.273012 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.273727 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.273755 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.273765 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.275792 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d835723dd70350dbcf7d8005d050ec6c582714a3438540714c4f809bf9775b03"} Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.275848 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.275852 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.276498 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.276527 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.276545 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.276802 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.278089 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.279682 4912 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:35 crc kubenswrapper[4912]: W0318 13:02:35.359672 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 18 13:02:35 crc kubenswrapper[4912]: E0318 13:02:35.359761 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.420959 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.422840 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.422888 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.422900 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.422938 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:02:35 crc kubenswrapper[4912]: E0318 13:02:35.423492 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 18 13:02:35 crc kubenswrapper[4912]: I0318 13:02:35.819661 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.282099 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8e261252b5533fa62b0114ff661ecccea7d865ae2b09cb1d2d8dd8608440f104"} Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.282178 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.282979 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.283014 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.283031 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.284603 4912 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="09bf761753b011f29c4271d904d37cf70dd91674fc5f929c4446f9407a74f403" exitCode=0 Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.284630 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"09bf761753b011f29c4271d904d37cf70dd91674fc5f929c4446f9407a74f403"} Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.284677 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.284759 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 
13:02:36.284801 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.284878 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.284811 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.285315 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.285333 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.285341 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.285958 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.285987 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.285996 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.286275 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.286305 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.286316 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.286427 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.286462 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.286479 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.291514 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:36 crc kubenswrapper[4912]: I0318 13:02:36.765395 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.290822 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"10933336579a2ce8ced354a680fc841616e6a174cdae2b2898b3ea0542b3281b"} Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.290876 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"58247da5b90e9a1249b5aa03ece821962270dc8b63310cee83585fca56c34906"} Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.290892 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e40107e644944595aafe11abebff18b0b31a2324a71d49a5b9ebc19436065244"} Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.290929 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:37 crc 
kubenswrapper[4912]: I0318 13:02:37.290985 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.290988 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.292173 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.292193 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.292201 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.292559 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.292583 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:37 crc kubenswrapper[4912]: I0318 13:02:37.292592 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.300204 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"72fc4ad1dd1763fccf9a82f69fd1b3c6df3d42727582ef54f0aa09f06a6d2fe8"} Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.300277 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"404a60aa922eb512750855fc2a821c3592d76bd7bd5f2d69e7ebd7810b9dc8a5"} Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 
13:02:38.300311 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.300345 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.301433 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.301473 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.301489 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.301660 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.301743 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.301767 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.419875 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.420170 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.421519 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.421617 4912 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.421647 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.489971 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.624344 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.626303 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.626371 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.626382 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:38 crc kubenswrapper[4912]: I0318 13:02:38.626415 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:02:39 crc kubenswrapper[4912]: I0318 13:02:39.302835 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:39 crc kubenswrapper[4912]: I0318 13:02:39.304266 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:39 crc kubenswrapper[4912]: I0318 13:02:39.304374 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:39 crc kubenswrapper[4912]: I0318 13:02:39.304396 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:39 crc kubenswrapper[4912]: I0318 13:02:39.533628 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.305291 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.306673 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.306725 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.306749 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.430725 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.430956 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.432723 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.432796 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:40 crc kubenswrapper[4912]: I0318 13:02:40.432823 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:41 crc kubenswrapper[4912]: I0318 13:02:41.376365 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 13:02:41 crc kubenswrapper[4912]: I0318 13:02:41.376617 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:41 crc kubenswrapper[4912]: I0318 13:02:41.378193 
4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:41 crc kubenswrapper[4912]: I0318 13:02:41.378254 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:41 crc kubenswrapper[4912]: I0318 13:02:41.378272 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:41 crc kubenswrapper[4912]: I0318 13:02:41.420089 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:02:41 crc kubenswrapper[4912]: I0318 13:02:41.420206 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:02:42 crc kubenswrapper[4912]: E0318 13:02:42.319540 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:02:42 crc kubenswrapper[4912]: I0318 13:02:42.423595 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:42 crc kubenswrapper[4912]: I0318 13:02:42.423873 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:42 crc kubenswrapper[4912]: I0318 13:02:42.425755 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 13:02:42 crc kubenswrapper[4912]: I0318 13:02:42.425822 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:42 crc kubenswrapper[4912]: I0318 13:02:42.425845 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:42 crc kubenswrapper[4912]: I0318 13:02:42.429757 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:43 crc kubenswrapper[4912]: I0318 13:02:43.314138 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:43 crc kubenswrapper[4912]: I0318 13:02:43.315631 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:43 crc kubenswrapper[4912]: I0318 13:02:43.315684 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:43 crc kubenswrapper[4912]: I0318 13:02:43.315701 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:43 crc kubenswrapper[4912]: I0318 13:02:43.320238 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:02:44 crc kubenswrapper[4912]: I0318 13:02:44.316155 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:44 crc kubenswrapper[4912]: I0318 13:02:44.317555 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:44 crc kubenswrapper[4912]: I0318 13:02:44.317587 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:44 crc 
kubenswrapper[4912]: I0318 13:02:44.317600 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:45 crc kubenswrapper[4912]: W0318 13:02:45.678328 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 13:02:45 crc kubenswrapper[4912]: I0318 13:02:45.679477 4912 trace.go:236] Trace[380863901]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 13:02:35.676) (total time: 10002ms): Mar 18 13:02:45 crc kubenswrapper[4912]: Trace[380863901]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:02:45.678) Mar 18 13:02:45 crc kubenswrapper[4912]: Trace[380863901]: [10.002574452s] [10.002574452s] END Mar 18 13:02:45 crc kubenswrapper[4912]: E0318 13:02:45.679628 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 13:02:46 crc kubenswrapper[4912]: E0318 13:02:46.038732 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 13:02:46 crc kubenswrapper[4912]: E0318 13:02:46.040086 4912 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed 
while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:46 crc kubenswrapper[4912]: E0318 13:02:46.042280 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 13:02:46 crc kubenswrapper[4912]: W0318 13:02:46.043696 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z Mar 18 13:02:46 crc kubenswrapper[4912]: E0318 13:02:46.043776 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.046309 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z Mar 18 13:02:46 crc kubenswrapper[4912]: W0318 13:02:46.048485 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z Mar 18 13:02:46 crc kubenswrapper[4912]: E0318 13:02:46.048616 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:46 crc kubenswrapper[4912]: E0318 13:02:46.049963 4912 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df120f7c9e46e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,LastTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.050801 
4912 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.050889 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 13:02:46 crc kubenswrapper[4912]: W0318 13:02:46.051895 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z Mar 18 13:02:46 crc kubenswrapper[4912]: E0318 13:02:46.051987 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.054759 4912 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.054825 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.164694 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:46Z is after 2026-02-23T05:33:13Z Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.322792 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.324217 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8e261252b5533fa62b0114ff661ecccea7d865ae2b09cb1d2d8dd8608440f104" exitCode=255 Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.324255 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8e261252b5533fa62b0114ff661ecccea7d865ae2b09cb1d2d8dd8608440f104"} Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.324398 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.325182 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 
13:02:46.325205 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.325215 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.325710 4912 scope.go:117] "RemoveContainer" containerID="8e261252b5533fa62b0114ff661ecccea7d865ae2b09cb1d2d8dd8608440f104" Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.774992 4912 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]log ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]etcd ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-apiextensions-informers ok Mar 18 13:02:46 crc kubenswrapper[4912]: 
[+]poststarthook/start-apiextensions-controllers ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/crd-informer-synced ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 13:02:46 crc kubenswrapper[4912]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 13:02:46 crc kubenswrapper[4912]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/bootstrap-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/apiservice-registration-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/apiservice-discovery-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 13:02:46 crc kubenswrapper[4912]: 
[+]autoregister-completion ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 13:02:46 crc kubenswrapper[4912]: livez check failed Mar 18 13:02:46 crc kubenswrapper[4912]: I0318 13:02:46.775112 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:02:47 crc kubenswrapper[4912]: I0318 13:02:47.163561 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:47Z is after 2026-02-23T05:33:13Z Mar 18 13:02:47 crc kubenswrapper[4912]: I0318 13:02:47.329928 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 13:02:47 crc kubenswrapper[4912]: I0318 13:02:47.331969 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13"} Mar 18 13:02:47 crc kubenswrapper[4912]: I0318 13:02:47.332162 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:47 crc kubenswrapper[4912]: I0318 13:02:47.332991 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:47 crc kubenswrapper[4912]: I0318 13:02:47.333025 4912 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:47 crc kubenswrapper[4912]: I0318 13:02:47.333050 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.164502 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:48Z is after 2026-02-23T05:33:13Z Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.339380 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.340018 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.343070 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13" exitCode=255 Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.343105 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13"} Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.343469 4912 scope.go:117] "RemoveContainer" containerID="8e261252b5533fa62b0114ff661ecccea7d865ae2b09cb1d2d8dd8608440f104" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.343592 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.344832 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.344865 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.344878 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:48 crc kubenswrapper[4912]: I0318 13:02:48.345620 4912 scope.go:117] "RemoveContainer" containerID="dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13" Mar 18 13:02:48 crc kubenswrapper[4912]: E0318 13:02:48.345865 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:02:49 crc kubenswrapper[4912]: I0318 13:02:49.163559 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:49Z is after 2026-02-23T05:33:13Z Mar 18 13:02:49 crc kubenswrapper[4912]: I0318 13:02:49.349324 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 13:02:49 crc kubenswrapper[4912]: I0318 13:02:49.564190 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 13:02:49 
crc kubenswrapper[4912]: I0318 13:02:49.564444 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:49 crc kubenswrapper[4912]: I0318 13:02:49.565777 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:49 crc kubenswrapper[4912]: I0318 13:02:49.565828 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:49 crc kubenswrapper[4912]: I0318 13:02:49.565840 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:49 crc kubenswrapper[4912]: I0318 13:02:49.580696 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 13:02:50 crc kubenswrapper[4912]: W0318 13:02:50.136541 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:50Z is after 2026-02-23T05:33:13Z Mar 18 13:02:50 crc kubenswrapper[4912]: E0318 13:02:50.136629 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:50 crc kubenswrapper[4912]: W0318 13:02:50.156151 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:50Z is after 2026-02-23T05:33:13Z Mar 18 13:02:50 crc kubenswrapper[4912]: E0318 13:02:50.156240 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:50 crc kubenswrapper[4912]: I0318 13:02:50.163964 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:50Z is after 2026-02-23T05:33:13Z Mar 18 13:02:50 crc kubenswrapper[4912]: I0318 13:02:50.354939 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:50 crc kubenswrapper[4912]: I0318 13:02:50.356491 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:50 crc kubenswrapper[4912]: I0318 13:02:50.356562 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:50 crc kubenswrapper[4912]: I0318 13:02:50.356586 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.164855 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:51Z is after 2026-02-23T05:33:13Z Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.421198 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.421367 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:02:51 crc kubenswrapper[4912]: W0318 13:02:51.539450 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:51Z is after 2026-02-23T05:33:13Z Mar 18 13:02:51 crc kubenswrapper[4912]: E0318 13:02:51.539569 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:51Z is 
after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.771824 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.772018 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.773382 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.773435 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.773448 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.774206 4912 scope.go:117] "RemoveContainer" containerID="dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13" Mar 18 13:02:51 crc kubenswrapper[4912]: E0318 13:02:51.774394 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:02:51 crc kubenswrapper[4912]: I0318 13:02:51.776767 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.166215 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:52Z is after 2026-02-23T05:33:13Z Mar 18 13:02:52 crc kubenswrapper[4912]: E0318 13:02:52.319708 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.360333 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.361259 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.361314 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.361323 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.361871 4912 scope.go:117] "RemoveContainer" containerID="dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13" Mar 18 13:02:52 crc kubenswrapper[4912]: E0318 13:02:52.362048 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.443327 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:52 crc kubenswrapper[4912]: E0318 13:02:52.444620 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:52Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.445239 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.445327 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.445347 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.445382 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:02:52 crc kubenswrapper[4912]: E0318 13:02:52.449814 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:52Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 13:02:52 crc kubenswrapper[4912]: I0318 13:02:52.888513 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:53 crc kubenswrapper[4912]: I0318 13:02:53.167302 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:53Z is after 2026-02-23T05:33:13Z Mar 18 13:02:53 crc kubenswrapper[4912]: I0318 13:02:53.363622 4912 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 18 13:02:53 crc kubenswrapper[4912]: I0318 13:02:53.364836 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:53 crc kubenswrapper[4912]: I0318 13:02:53.364875 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:53 crc kubenswrapper[4912]: I0318 13:02:53.364887 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:53 crc kubenswrapper[4912]: I0318 13:02:53.365621 4912 scope.go:117] "RemoveContainer" containerID="dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13" Mar 18 13:02:53 crc kubenswrapper[4912]: E0318 13:02:53.365850 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:02:54 crc kubenswrapper[4912]: I0318 13:02:54.069944 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 13:02:54 crc kubenswrapper[4912]: E0318 13:02:54.077151 4912 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:54 crc kubenswrapper[4912]: I0318 13:02:54.165232 4912 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:54Z is after 2026-02-23T05:33:13Z Mar 18 13:02:55 crc kubenswrapper[4912]: I0318 13:02:55.163779 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:55Z is after 2026-02-23T05:33:13Z Mar 18 13:02:56 crc kubenswrapper[4912]: E0318 13:02:56.055767 4912 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:56Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df120f7c9e46e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,LastTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:02:56 crc kubenswrapper[4912]: I0318 13:02:56.163919 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-18T13:02:56Z is after 2026-02-23T05:33:13Z Mar 18 13:02:56 crc kubenswrapper[4912]: W0318 13:02:56.262505 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:56Z is after 2026-02-23T05:33:13Z Mar 18 13:02:56 crc kubenswrapper[4912]: E0318 13:02:56.262588 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:57 crc kubenswrapper[4912]: I0318 13:02:57.165616 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:57Z is after 2026-02-23T05:33:13Z Mar 18 13:02:57 crc kubenswrapper[4912]: W0318 13:02:57.355878 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:57Z is after 2026-02-23T05:33:13Z Mar 18 13:02:57 crc kubenswrapper[4912]: E0318 13:02:57.356008 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list 
*v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 13:02:58 crc kubenswrapper[4912]: I0318 13:02:58.163332 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:58Z is after 2026-02-23T05:33:13Z Mar 18 13:02:58 crc kubenswrapper[4912]: I0318 13:02:58.312467 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:02:58 crc kubenswrapper[4912]: I0318 13:02:58.312693 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:58 crc kubenswrapper[4912]: I0318 13:02:58.314399 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:58 crc kubenswrapper[4912]: I0318 13:02:58.314460 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:58 crc kubenswrapper[4912]: I0318 13:02:58.314479 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:58 crc kubenswrapper[4912]: I0318 13:02:58.315530 4912 scope.go:117] "RemoveContainer" containerID="dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13" Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.164925 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:59Z is after 2026-02-23T05:33:13Z Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.382383 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.385895 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137"} Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.386176 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.387511 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.387567 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.387589 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:02:59 crc kubenswrapper[4912]: E0318 13:02:59.449748 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:59Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.450897 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:02:59 crc 
kubenswrapper[4912]: I0318 13:02:59.452733 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.452800 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.452817 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:02:59 crc kubenswrapper[4912]: I0318 13:02:59.452858 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 13:02:59 crc kubenswrapper[4912]: E0318 13:02:59.456697 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:02:59Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.165463 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:00Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.392271 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.393248 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.395812 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137" exitCode=255
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.395881 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137"}
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.395944 4912 scope.go:117] "RemoveContainer" containerID="dbfcbe8647b048d512341eedee10b477ef969305c5265641c9bd05f353e04b13"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.396217 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.398950 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.399005 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.399025 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:00 crc kubenswrapper[4912]: I0318 13:03:00.399899 4912 scope.go:117] "RemoveContainer" containerID="15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137"
Mar 18 13:03:00 crc kubenswrapper[4912]: E0318 13:03:00.400232 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 13:03:00 crc kubenswrapper[4912]: W0318 13:03:00.623885 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:00Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:00 crc kubenswrapper[4912]: E0318 13:03:00.623998 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.164687 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:01Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.404543 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.420906 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.421011 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.421155 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.421414 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.422921 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.422971 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.422982 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.423540 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"eb89ce0333288e4b2c7ce5ce243c1beacd93814565f95b08c90f42612958dc65"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 18 13:03:01 crc kubenswrapper[4912]: I0318 13:03:01.423702 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://eb89ce0333288e4b2c7ce5ce243c1beacd93814565f95b08c90f42612958dc65" gracePeriod=30
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.165370 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:02Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:02 crc kubenswrapper[4912]: E0318 13:03:02.319879 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.417714 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.418622 4912 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="eb89ce0333288e4b2c7ce5ce243c1beacd93814565f95b08c90f42612958dc65" exitCode=255
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.418705 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"eb89ce0333288e4b2c7ce5ce243c1beacd93814565f95b08c90f42612958dc65"}
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.418832 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"838c29b7d7f7c686b912cdbe45d9cf7a1a2960450df909a597205383d2ef2d64"}
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.419093 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.424874 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.424957 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.424979 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:02 crc kubenswrapper[4912]: W0318 13:03:02.765127 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:02Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:02 crc kubenswrapper[4912]: E0318 13:03:02.765257 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.888704 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.889500 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.891606 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.891681 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.891709 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:02 crc kubenswrapper[4912]: I0318 13:03:02.892868 4912 scope.go:117] "RemoveContainer" containerID="15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137"
Mar 18 13:03:02 crc kubenswrapper[4912]: E0318 13:03:02.893239 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 13:03:03 crc kubenswrapper[4912]: I0318 13:03:03.166300 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:03Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:04 crc kubenswrapper[4912]: I0318 13:03:04.164332 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:04Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:05 crc kubenswrapper[4912]: I0318 13:03:05.166123 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:05Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:06 crc kubenswrapper[4912]: E0318 13:03:06.060629 4912 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:06Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df120f7c9e46e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,LastTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:06 crc kubenswrapper[4912]: I0318 13:03:06.163720 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:06Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:06 crc kubenswrapper[4912]: E0318 13:03:06.454681 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:06Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 13:03:06 crc kubenswrapper[4912]: I0318 13:03:06.457025 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:06 crc kubenswrapper[4912]: I0318 13:03:06.458514 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:06 crc kubenswrapper[4912]: I0318 13:03:06.458553 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:06 crc kubenswrapper[4912]: I0318 13:03:06.458568 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:06 crc kubenswrapper[4912]: I0318 13:03:06.458597 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 13:03:06 crc kubenswrapper[4912]: E0318 13:03:06.463380 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:06Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 13:03:07 crc kubenswrapper[4912]: I0318 13:03:07.165258 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:07Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.163879 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:08Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.312572 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.312880 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.314917 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.314984 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.315004 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.316099 4912 scope.go:117] "RemoveContainer" containerID="15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137"
Mar 18 13:03:08 crc kubenswrapper[4912]: E0318 13:03:08.316427 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.420224 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.420552 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.422257 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.422318 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:08 crc kubenswrapper[4912]: I0318 13:03:08.422337 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:09 crc kubenswrapper[4912]: I0318 13:03:09.163872 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:09Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:09 crc kubenswrapper[4912]: W0318 13:03:09.862271 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:09Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:09 crc kubenswrapper[4912]: E0318 13:03:09.862389 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 13:03:10 crc kubenswrapper[4912]: I0318 13:03:10.164763 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:10Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:10 crc kubenswrapper[4912]: I0318 13:03:10.430822 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 13:03:10 crc kubenswrapper[4912]: I0318 13:03:10.431159 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:10 crc kubenswrapper[4912]: I0318 13:03:10.432818 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:10 crc kubenswrapper[4912]: I0318 13:03:10.432881 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:10 crc kubenswrapper[4912]: I0318 13:03:10.432897 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:11 crc kubenswrapper[4912]: I0318 13:03:11.165253 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:11Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:11 crc kubenswrapper[4912]: I0318 13:03:11.296111 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 18 13:03:11 crc kubenswrapper[4912]: E0318 13:03:11.301821 4912 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 13:03:11 crc kubenswrapper[4912]: E0318 13:03:11.303071 4912 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 18 13:03:11 crc kubenswrapper[4912]: I0318 13:03:11.420902 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 13:03:11 crc kubenswrapper[4912]: I0318 13:03:11.421025 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 13:03:12 crc kubenswrapper[4912]: I0318 13:03:12.165587 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:12Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:12 crc kubenswrapper[4912]: E0318 13:03:12.320122 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 13:03:13 crc kubenswrapper[4912]: I0318 13:03:13.165455 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:13Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:13 crc kubenswrapper[4912]: E0318 13:03:13.461095 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:13Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 13:03:13 crc kubenswrapper[4912]: I0318 13:03:13.464376 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 13:03:13 crc kubenswrapper[4912]: I0318 13:03:13.466424 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 13:03:13 crc kubenswrapper[4912]: I0318 13:03:13.466490 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 13:03:13 crc kubenswrapper[4912]: I0318 13:03:13.466509 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 13:03:13 crc kubenswrapper[4912]: I0318 13:03:13.466550 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 13:03:13 crc kubenswrapper[4912]: E0318 13:03:13.470368 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:13Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 13:03:14 crc kubenswrapper[4912]: I0318 13:03:14.163882 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:03:14Z is after 2026-02-23T05:33:13Z
Mar 18 13:03:15 crc kubenswrapper[4912]: I0318 13:03:15.165642 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 13:03:15 crc kubenswrapper[4912]: W0318 13:03:15.655190 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 18 13:03:15 crc kubenswrapper[4912]: E0318 13:03:15.655266 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.067536 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120f7c9e46e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,LastTimestamp:2026-03-18 13:02:32.15229451 +0000 UTC m=+0.611722025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.072271 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.077753 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.082399 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fcd01c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,LastTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.086752 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df12101540b50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.312343376 +0000 UTC m=+0.771770801,LastTimestamp:2026-03-18 13:02:32.312343376 +0000 UTC m=+0.771770801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.091366 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccfb182\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.330406308 +0000 UTC m=+0.789833743,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.096358 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccff606\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.330425428 +0000 UTC m=+0.789852863,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.100989 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fcd01c45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fcd01c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,LastTimestamp:2026-03-18 13:02:32.330440619 +0000 UTC m=+0.789868054,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.105652 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccfb182\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.332342142 +0000 UTC m=+0.791769597,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.109259 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccff606\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.332451014 +0000 UTC m=+0.791878479,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.112919 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fcd01c45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fcd01c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,LastTimestamp:2026-03-18 13:02:32.332476095 +0000 UTC m=+0.791903550,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.117256 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccfb182\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.334866649 +0000 UTC m=+0.794294084,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.121194 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccff606\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.3348852 +0000 UTC m=+0.794312635,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.124856 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fcd01c45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fcd01c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,LastTimestamp:2026-03-18 13:02:32.33489855 +0000 UTC m=+0.794325985,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.128383 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccfb182\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.335113215 +0000 UTC m=+0.794540680,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.131917 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccff606\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.335183797 +0000 UTC m=+0.794611252,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.135520 4912 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fcd01c45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fcd01c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,LastTimestamp:2026-03-18 13:02:32.335202337 +0000 UTC m=+0.794629802,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.139055 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccfb182\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.33531738 +0000 UTC m=+0.794744815,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.142636 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccff606\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.33532929 +0000 UTC m=+0.794756725,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.146284 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fcd01c45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fcd01c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,LastTimestamp:2026-03-18 13:02:32.33533998 +0000 UTC m=+0.794767415,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.151566 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccfb182\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.337124261 +0000 UTC m=+0.796551716,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.158103 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccfb182\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccfb182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23656077 +0000 UTC m=+0.695988195,LastTimestamp:2026-03-18 13:02:32.337189863 +0000 UTC m=+0.796617318,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: I0318 13:03:16.163458 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.164347 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccff606\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.337205793 +0000 UTC m=+0.796633248,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.166875 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fcd01c45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fcd01c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.236588101 +0000 UTC m=+0.696015526,LastTimestamp:2026-03-18 13:02:32.337225573 +0000 UTC m=+0.796653038,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.171079 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df120fccff606\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df120fccff606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.23657831 +0000 UTC m=+0.696005735,LastTimestamp:2026-03-18 13:02:32.337274094 +0000 UTC m=+0.796701559,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.176625 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df1211ad9d889 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.740542601 +0000 UTC m=+1.199970026,LastTimestamp:2026-03-18 13:02:32.740542601 +0000 UTC m=+1.199970026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.180721 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df1211adab0a2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.740597922 +0000 UTC m=+1.200025367,LastTimestamp:2026-03-18 13:02:32.740597922 +0000 UTC m=+1.200025367,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.184900 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1211b484332 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.747778866 +0000 UTC m=+1.207206291,LastTimestamp:2026-03-18 13:02:32.747778866 +0000 UTC m=+1.207206291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 
13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.188858 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df1211bb8d7e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.755156964 +0000 UTC m=+1.214584389,LastTimestamp:2026-03-18 13:02:32.755156964 +0000 UTC m=+1.214584389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.195357 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1211bd0bd2e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:32.75672299 +0000 UTC m=+1.216150425,LastTimestamp:2026-03-18 13:02:32.75672299 +0000 UTC m=+1.216150425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.199975 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1213c57eb78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.302453112 +0000 UTC m=+1.761880537,LastTimestamp:2026-03-18 13:02:33.302453112 +0000 UTC m=+1.761880537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.205013 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df1213c582b60 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.302469472 +0000 UTC m=+1.761896887,LastTimestamp:2026-03-18 13:02:33.302469472 +0000 UTC m=+1.761896887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.209100 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df1213c7a08b7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.304688823 +0000 UTC m=+1.764116248,LastTimestamp:2026-03-18 13:02:33.304688823 +0000 UTC m=+1.764116248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.213195 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1213c852761 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.305417569 +0000 UTC m=+1.764844994,LastTimestamp:2026-03-18 13:02:33.305417569 +0000 UTC m=+1.764844994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.217237 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df1213c88b8af openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.305651375 +0000 UTC m=+1.765078810,LastTimestamp:2026-03-18 13:02:33.305651375 +0000 UTC m=+1.765078810,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.221121 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1213cf2934f 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.312588623 +0000 UTC m=+1.772016048,LastTimestamp:2026-03-18 13:02:33.312588623 +0000 UTC m=+1.772016048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.224486 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df1213d0586e7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.313830631 +0000 UTC m=+1.773258066,LastTimestamp:2026-03-18 13:02:33.313830631 +0000 UTC m=+1.773258066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.227932 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1213d0e273c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.314395964 +0000 UTC m=+1.773823389,LastTimestamp:2026-03-18 13:02:33.314395964 +0000 UTC m=+1.773823389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.231111 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df1213d3ccc86 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.317452934 +0000 UTC m=+1.776880359,LastTimestamp:2026-03-18 13:02:33.317452934 +0000 UTC m=+1.776880359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.235242 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1213d4b1ca1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.318390945 +0000 UTC m=+1.777818360,LastTimestamp:2026-03-18 13:02:33.318390945 +0000 UTC m=+1.777818360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.238839 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df1213d4b585b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.318406235 +0000 UTC m=+1.777833660,LastTimestamp:2026-03-18 13:02:33.318406235 +0000 UTC m=+1.777833660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.242140 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1214f14bcc4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.616817348 +0000 UTC m=+2.076244773,LastTimestamp:2026-03-18 13:02:33.616817348 +0000 UTC m=+2.076244773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.246476 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1214fb5ca28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 
13:02:33.627372072 +0000 UTC m=+2.086799487,LastTimestamp:2026-03-18 13:02:33.627372072 +0000 UTC m=+2.086799487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.255695 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1214fcc355b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.628841307 +0000 UTC m=+2.088268722,LastTimestamp:2026-03-18 13:02:33.628841307 +0000 UTC m=+2.088268722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.260026 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1215a9db330 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.810342704 +0000 UTC m=+2.269770129,LastTimestamp:2026-03-18 13:02:33.810342704 +0000 UTC m=+2.269770129,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.264928 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1215ba333e2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.827480546 +0000 UTC m=+2.286907971,LastTimestamp:2026-03-18 13:02:33.827480546 +0000 UTC m=+2.286907971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.271211 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1215bb5aee5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.828691685 +0000 UTC m=+2.288119110,LastTimestamp:2026-03-18 13:02:33.828691685 +0000 UTC m=+2.288119110,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.275973 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df12167789dc1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.026016193 +0000 UTC m=+2.485443618,LastTimestamp:2026-03-18 13:02:34.026016193 +0000 UTC 
m=+2.485443618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.281494 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df121682c2b82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.037783426 +0000 UTC m=+2.497210851,LastTimestamp:2026-03-18 13:02:34.037783426 +0000 UTC m=+2.497210851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.286418 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df12175039682 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.25322765 +0000 UTC m=+2.712655085,LastTimestamp:2026-03-18 13:02:34.25322765 +0000 UTC m=+2.712655085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.292139 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1217530d53f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.256192831 +0000 UTC m=+2.715620276,LastTimestamp:2026-03-18 13:02:34.256192831 +0000 UTC m=+2.715620276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.297384 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df12175882cd9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.261916889 +0000 UTC m=+2.721344324,LastTimestamp:2026-03-18 13:02:34.261916889 +0000 UTC m=+2.721344324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.302723 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df12175dd0a4b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.267478603 +0000 UTC m=+2.726906038,LastTimestamp:2026-03-18 13:02:34.267478603 +0000 UTC m=+2.726906038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 
13:03:16.307169 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df121830d6381 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.488750977 +0000 UTC m=+2.948178402,LastTimestamp:2026-03-18 13:02:34.488750977 +0000 UTC m=+2.948178402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.311615 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df12183164f95 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.489335701 +0000 UTC m=+2.948763116,LastTimestamp:2026-03-18 13:02:34.489335701 +0000 UTC m=+2.948763116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.315989 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df121831dc320 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.489824032 +0000 UTC m=+2.949251457,LastTimestamp:2026-03-18 13:02:34.489824032 +0000 UTC m=+2.949251457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.321025 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121831f3271 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.489918065 +0000 UTC m=+2.949345490,LastTimestamp:2026-03-18 13:02:34.489918065 +0000 UTC m=+2.949345490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.326085 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df12183f64efe openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.504015614 +0000 UTC m=+2.963443039,LastTimestamp:2026-03-18 13:02:34.504015614 +0000 UTC m=+2.963443039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.329804 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df1218405a132 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.505019698 +0000 UTC m=+2.964447123,LastTimestamp:2026-03-18 
13:02:34.505019698 +0000 UTC m=+2.964447123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.333687 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df12184076317 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.505134871 +0000 UTC m=+2.964562296,LastTimestamp:2026-03-18 13:02:34.505134871 +0000 UTC m=+2.964562296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.338146 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121841349ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.50591481 +0000 UTC m=+2.965342235,LastTimestamp:2026-03-18 13:02:34.50591481 +0000 UTC m=+2.965342235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.342101 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121844ac833 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.509551667 +0000 UTC m=+2.968979082,LastTimestamp:2026-03-18 13:02:34.509551667 +0000 UTC m=+2.968979082,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.346395 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df1218f7bd6a5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.697316005 +0000 UTC m=+3.156743430,LastTimestamp:2026-03-18 13:02:34.697316005 +0000 UTC m=+3.156743430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.351488 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df12190504304 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.71123738 +0000 UTC m=+3.170664825,LastTimestamp:2026-03-18 13:02:34.71123738 +0000 UTC m=+3.170664825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.355785 4912 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df12190514e2b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.711305771 +0000 UTC m=+3.170733196,LastTimestamp:2026-03-18 13:02:34.711305771 +0000 UTC m=+3.170733196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.361463 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df12190680b35 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.712795957 +0000 UTC m=+3.172223382,LastTimestamp:2026-03-18 13:02:34.712795957 +0000 UTC 
m=+3.172223382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.365598 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df12190e9f5f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.721310192 +0000 UTC m=+3.180737627,LastTimestamp:2026-03-18 13:02:34.721310192 +0000 UTC m=+3.180737627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.371119 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df12190fc4b0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.722511631 +0000 UTC m=+3.181939056,LastTimestamp:2026-03-18 13:02:34.722511631 +0000 UTC m=+3.181939056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.376520 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df1219bcd4337 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.903978807 +0000 UTC m=+3.363406232,LastTimestamp:2026-03-18 13:02:34.903978807 +0000 UTC m=+3.363406232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.381702 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df1219c12b8db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.908530907 +0000 UTC m=+3.367958332,LastTimestamp:2026-03-18 13:02:34.908530907 +0000 UTC m=+3.367958332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.386639 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df1219c68e81e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.914179102 +0000 UTC m=+3.373606527,LastTimestamp:2026-03-18 13:02:34.914179102 +0000 UTC m=+3.373606527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.390474 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df1219ce66670 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.92240344 +0000 UTC m=+3.381830865,LastTimestamp:2026-03-18 13:02:34.92240344 +0000 UTC m=+3.381830865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.395189 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df1219cf620bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:34.923434175 +0000 UTC m=+3.382861600,LastTimestamp:2026-03-18 13:02:34.923434175 +0000 UTC m=+3.382861600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.399923 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df121a2d3bb49 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.021843273 +0000 UTC m=+3.481270698,LastTimestamp:2026-03-18 13:02:35.021843273 +0000 UTC m=+3.481270698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.404556 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121a66f1231 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.082355249 +0000 UTC m=+3.541782674,LastTimestamp:2026-03-18 13:02:35.082355249 +0000 UTC 
m=+3.541782674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.409635 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121a709dd3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.092499773 +0000 UTC m=+3.551927198,LastTimestamp:2026-03-18 13:02:35.092499773 +0000 UTC m=+3.551927198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.414499 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121a7189eac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.093466796 +0000 UTC m=+3.552894211,LastTimestamp:2026-03-18 13:02:35.093466796 +0000 UTC m=+3.552894211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.418743 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df121b1f1de2c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.275476524 +0000 UTC m=+3.734903949,LastTimestamp:2026-03-18 13:02:35.275476524 +0000 UTC m=+3.734903949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.422924 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121b2b0e313 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.287995155 +0000 UTC m=+3.747422580,LastTimestamp:2026-03-18 13:02:35.287995155 +0000 UTC m=+3.747422580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.426630 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121b3c07a2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.305794094 +0000 UTC m=+3.765221519,LastTimestamp:2026-03-18 13:02:35.305794094 +0000 UTC m=+3.765221519,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.430146 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189df121bdcfd99d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.474573725 +0000 UTC m=+3.934001140,LastTimestamp:2026-03-18 13:02:35.474573725 +0000 UTC m=+3.934001140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.434220 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df121bea826c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.488749256 +0000 UTC m=+3.948176681,LastTimestamp:2026-03-18 13:02:35.488749256 +0000 UTC m=+3.948176681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.439731 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189df121ee492ee6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:36.287831782 +0000 UTC m=+4.747259207,LastTimestamp:2026-03-18 13:02:36.287831782 +0000 UTC m=+4.747259207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.444927 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df121fcd083b5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:36.531581877 +0000 UTC m=+4.991009342,LastTimestamp:2026-03-18 13:02:36.531581877 +0000 UTC m=+4.991009342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.449230 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df121fd602df9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:36.540997113 +0000 UTC m=+5.000424578,LastTimestamp:2026-03-18 13:02:36.540997113 +0000 UTC m=+5.000424578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.452931 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df121fd74d914 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:36.542351636 +0000 UTC m=+5.001779061,LastTimestamp:2026-03-18 13:02:36.542351636 +0000 UTC m=+5.001779061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.457886 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df12208ddfe08 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:36.733791752 +0000 UTC m=+5.193219177,LastTimestamp:2026-03-18 13:02:36.733791752 +0000 UTC m=+5.193219177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.463308 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df12209c604c4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:36.748997828 +0000 UTC m=+5.208425263,LastTimestamp:2026-03-18 13:02:36.748997828 +0000 UTC m=+5.208425263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.468234 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189df12209db2a0f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:36.750383631 +0000 UTC m=+5.209811056,LastTimestamp:2026-03-18 13:02:36.750383631 +0000 UTC m=+5.209811056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.474454 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1221b16867b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.039486587 +0000 UTC m=+5.498914022,LastTimestamp:2026-03-18 13:02:37.039486587 +0000 UTC m=+5.498914022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.479124 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1221c028af5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.054954229 +0000 UTC m=+5.514381654,LastTimestamp:2026-03-18 13:02:37.054954229 +0000 UTC m=+5.514381654,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.486603 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1221c1d5891 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.056710801 +0000 UTC m=+5.516138216,LastTimestamp:2026-03-18 13:02:37.056710801 +0000 UTC m=+5.516138216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.493525 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df12229194ad8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.274548952 +0000 UTC m=+5.733976377,LastTimestamp:2026-03-18 13:02:37.274548952 +0000 UTC m=+5.733976377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.500668 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1222a420e0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.29399758 +0000 UTC m=+5.753425005,LastTimestamp:2026-03-18 13:02:37.29399758 +0000 UTC m=+5.753425005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.506135 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df1222a4fcaee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.294897902 +0000 UTC m=+5.754325327,LastTimestamp:2026-03-18 13:02:37.294897902 +0000 UTC m=+5.754325327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.512420 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df12237a8cc3e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.51883475 +0000 UTC m=+5.978262185,LastTimestamp:2026-03-18 13:02:37.51883475 +0000 UTC m=+5.978262185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.517566 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df122384f61ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:37.529752063 +0000 UTC m=+5.989179498,LastTimestamp:2026-03-18 13:02:37.529752063 +0000 UTC m=+5.989179498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.525096 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 13:03:16 crc kubenswrapper[4912]: &Event{ObjectMeta:{kube-controller-manager-crc.189df1232032768f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 13:03:16 crc kubenswrapper[4912]: body: Mar 18 13:03:16 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:41.420170895 +0000 UTC m=+9.879598360,LastTimestamp:2026-03-18 13:02:41.420170895 +0000 UTC m=+9.879598360,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:03:16 crc kubenswrapper[4912]: > Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.529971 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1232033c0e1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:41.420255457 +0000 UTC m=+9.879682922,LastTimestamp:2026-03-18 13:02:41.420255457 +0000 UTC m=+9.879682922,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.534418 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 13:03:16 crc kubenswrapper[4912]: &Event{ObjectMeta:{kube-apiserver-crc.189df1243435424b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 13:03:16 crc kubenswrapper[4912]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 13:03:16 crc kubenswrapper[4912]: Mar 18 13:03:16 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:46.050865739 +0000 UTC m=+14.510293174,LastTimestamp:2026-03-18 13:02:46.050865739 +0000 UTC m=+14.510293174,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:03:16 crc kubenswrapper[4912]: > Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.539069 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df12434360fed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:46.050918381 +0000 UTC m=+14.510345816,LastTimestamp:2026-03-18 13:02:46.050918381 +0000 UTC m=+14.510345816,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.544905 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df1243435424b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 13:03:16 crc kubenswrapper[4912]: &Event{ObjectMeta:{kube-apiserver-crc.189df1243435424b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 13:03:16 crc kubenswrapper[4912]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 13:03:16 crc kubenswrapper[4912]: Mar 18 13:03:16 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:46.050865739 +0000 UTC m=+14.510293174,LastTimestamp:2026-03-18 13:02:46.054805954 +0000 UTC m=+14.514233389,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:03:16 crc kubenswrapper[4912]: > Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.549614 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df12434360fed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df12434360fed openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:46.050918381 +0000 UTC m=+14.510345816,LastTimestamp:2026-03-18 13:02:46.054850345 +0000 UTC m=+14.514277780,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.554195 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df121a7189eac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121a7189eac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.093466796 +0000 UTC m=+3.552894211,LastTimestamp:2026-03-18 13:02:46.327167657 +0000 UTC m=+14.786595082,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.560097 4912 event.go:359] "Server rejected event (will 
not retry!)" err="events \"kube-apiserver-crc.189df121b2b0e313\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121b2b0e313 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.287995155 +0000 UTC m=+3.747422580,LastTimestamp:2026-03-18 13:02:46.507303651 +0000 UTC m=+14.966731096,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.566400 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df121b3c07a2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df121b3c07a2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:35.305794094 +0000 UTC m=+3.765221519,LastTimestamp:2026-03-18 13:02:46.519490875 +0000 UTC m=+14.978918310,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.573486 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 13:03:16 crc kubenswrapper[4912]: &Event{ObjectMeta:{kube-controller-manager-crc.189df125744fc7eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 13:03:16 crc kubenswrapper[4912]: body: Mar 18 13:03:16 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:51.421313003 +0000 UTC m=+19.880740508,LastTimestamp:2026-03-18 13:02:51.421313003 +0000 UTC m=+19.880740508,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:03:16 crc kubenswrapper[4912]: > Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.579599 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df125745197f3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:51.421431795 +0000 UTC m=+19.880859260,LastTimestamp:2026-03-18 13:02:51.421431795 +0000 UTC m=+19.880859260,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.589062 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df125744fc7eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 13:03:16 crc kubenswrapper[4912]: &Event{ObjectMeta:{kube-controller-manager-crc.189df125744fc7eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 13:03:16 crc kubenswrapper[4912]: body: Mar 18 13:03:16 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:51.421313003 +0000 UTC m=+19.880740508,LastTimestamp:2026-03-18 13:03:01.42097417 
+0000 UTC m=+29.880401635,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:03:16 crc kubenswrapper[4912]: > Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.594511 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df125745197f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df125745197f3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:51.421431795 +0000 UTC m=+19.880859260,LastTimestamp:2026-03-18 13:03:01.421108663 +0000 UTC m=+29.880536128,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.599793 4912 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df127c87fdd47 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:03:01.423684935 +0000 UTC m=+29.883112360,LastTimestamp:2026-03-18 13:03:01.423684935 +0000 UTC m=+29.883112360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.605093 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df1213d0e273c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1213d0e273c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.314395964 +0000 UTC m=+1.773823389,LastTimestamp:2026-03-18 13:03:01.544095032 +0000 UTC m=+30.003522467,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.611644 4912 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df1214f14bcc4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1214f14bcc4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.616817348 +0000 UTC m=+2.076244773,LastTimestamp:2026-03-18 13:03:01.754798172 +0000 UTC m=+30.214225597,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.617575 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df1214fb5ca28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df1214fb5ca28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:33.627372072 +0000 UTC 
m=+2.086799487,LastTimestamp:2026-03-18 13:03:01.769300891 +0000 UTC m=+30.228728316,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.624095 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df125744fc7eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 13:03:16 crc kubenswrapper[4912]: &Event{ObjectMeta:{kube-controller-manager-crc.189df125744fc7eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 13:03:16 crc kubenswrapper[4912]: body: Mar 18 13:03:16 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:51.421313003 +0000 UTC m=+19.880740508,LastTimestamp:2026-03-18 13:03:11.420979517 +0000 UTC m=+39.880406972,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:03:16 crc kubenswrapper[4912]: > Mar 18 13:03:16 crc kubenswrapper[4912]: E0318 13:03:16.627686 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df125745197f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189df125745197f3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:51.421431795 +0000 UTC m=+19.880859260,LastTimestamp:2026-03-18 13:03:11.42110099 +0000 UTC m=+39.880528445,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 13:03:17 crc kubenswrapper[4912]: I0318 13:03:17.165467 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:17 crc kubenswrapper[4912]: W0318 13:03:17.747083 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 13:03:17 crc kubenswrapper[4912]: E0318 13:03:17.747188 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 13:03:18 crc kubenswrapper[4912]: I0318 
13:03:18.164305 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:19 crc kubenswrapper[4912]: I0318 13:03:19.164181 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:20 crc kubenswrapper[4912]: I0318 13:03:20.167956 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:20 crc kubenswrapper[4912]: E0318 13:03:20.469303 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 13:03:20 crc kubenswrapper[4912]: I0318 13:03:20.470517 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:20 crc kubenswrapper[4912]: I0318 13:03:20.471991 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:20 crc kubenswrapper[4912]: I0318 13:03:20.472089 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:20 crc kubenswrapper[4912]: I0318 13:03:20.472111 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:20 crc kubenswrapper[4912]: I0318 13:03:20.472158 4912 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Mar 18 13:03:20 crc kubenswrapper[4912]: E0318 13:03:20.479517 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 13:03:21 crc kubenswrapper[4912]: I0318 13:03:21.167275 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:21 crc kubenswrapper[4912]: I0318 13:03:21.421778 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:03:21 crc kubenswrapper[4912]: I0318 13:03:21.421931 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:03:21 crc kubenswrapper[4912]: E0318 13:03:21.428338 4912 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df125744fc7eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 13:03:21 crc kubenswrapper[4912]: &Event{ObjectMeta:{kube-controller-manager-crc.189df125744fc7eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 13:03:21 crc kubenswrapper[4912]: body: Mar 18 13:03:21 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:02:51.421313003 +0000 UTC m=+19.880740508,LastTimestamp:2026-03-18 13:03:21.421875637 +0000 UTC m=+49.881303102,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:03:21 crc kubenswrapper[4912]: > Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.168074 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.227539 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.229631 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.229699 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.229721 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.230758 4912 scope.go:117] "RemoveContainer" 
containerID="15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137" Mar 18 13:03:22 crc kubenswrapper[4912]: W0318 13:03:22.294917 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 13:03:22 crc kubenswrapper[4912]: E0318 13:03:22.295004 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 13:03:22 crc kubenswrapper[4912]: E0318 13:03:22.320262 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.480920 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.482983 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b"} Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.483162 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.484028 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.484074 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:22 crc 
kubenswrapper[4912]: I0318 13:03:22.484083 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:22 crc kubenswrapper[4912]: I0318 13:03:22.889123 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.168088 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.354197 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.354457 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.356234 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.356310 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.356335 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.489168 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.489968 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" 
Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.492352 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" exitCode=255 Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.492410 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b"} Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.492466 4912 scope.go:117] "RemoveContainer" containerID="15584f3d1fbe5ea215f857f0e6465c4e412c658f9054ed28340027a723bfe137" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.492514 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.493761 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.493789 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.493800 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:23 crc kubenswrapper[4912]: I0318 13:03:23.494586 4912 scope.go:117] "RemoveContainer" containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" Mar 18 13:03:23 crc kubenswrapper[4912]: E0318 13:03:23.494777 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:03:24 crc kubenswrapper[4912]: I0318 13:03:24.164166 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:24 crc kubenswrapper[4912]: I0318 13:03:24.497189 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 13:03:24 crc kubenswrapper[4912]: I0318 13:03:24.499538 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:24 crc kubenswrapper[4912]: I0318 13:03:24.501085 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:24 crc kubenswrapper[4912]: I0318 13:03:24.501138 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:24 crc kubenswrapper[4912]: I0318 13:03:24.501150 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:24 crc kubenswrapper[4912]: I0318 13:03:24.501903 4912 scope.go:117] "RemoveContainer" containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" Mar 18 13:03:24 crc kubenswrapper[4912]: E0318 13:03:24.502188 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:03:25 crc 
kubenswrapper[4912]: I0318 13:03:25.167975 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:26 crc kubenswrapper[4912]: I0318 13:03:26.168241 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:27 crc kubenswrapper[4912]: I0318 13:03:27.165169 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:27 crc kubenswrapper[4912]: E0318 13:03:27.475931 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 13:03:27 crc kubenswrapper[4912]: I0318 13:03:27.479956 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:27 crc kubenswrapper[4912]: I0318 13:03:27.481652 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:27 crc kubenswrapper[4912]: I0318 13:03:27.481700 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:27 crc kubenswrapper[4912]: I0318 13:03:27.481716 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:27 crc kubenswrapper[4912]: I0318 13:03:27.481749 4912 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 18 13:03:27 crc kubenswrapper[4912]: E0318 13:03:27.487765 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.164886 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.313186 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.313413 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.314900 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.314956 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.314970 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.315675 4912 scope.go:117] "RemoveContainer" containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" Mar 18 13:03:28 crc kubenswrapper[4912]: E0318 13:03:28.315904 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.424820 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.425555 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.426868 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.426914 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.426926 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.430441 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.509817 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.510821 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.510869 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:28 crc kubenswrapper[4912]: I0318 13:03:28.510882 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:29 crc kubenswrapper[4912]: I0318 
13:03:29.164634 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:30 crc kubenswrapper[4912]: I0318 13:03:30.165090 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:31 crc kubenswrapper[4912]: I0318 13:03:31.163508 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:32 crc kubenswrapper[4912]: I0318 13:03:32.165390 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:32 crc kubenswrapper[4912]: E0318 13:03:32.320883 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:03:33 crc kubenswrapper[4912]: I0318 13:03:33.165292 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:34 crc kubenswrapper[4912]: I0318 13:03:34.165233 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:34 crc 
kubenswrapper[4912]: E0318 13:03:34.484085 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 13:03:34 crc kubenswrapper[4912]: I0318 13:03:34.488080 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:34 crc kubenswrapper[4912]: I0318 13:03:34.490832 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:34 crc kubenswrapper[4912]: I0318 13:03:34.490936 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:34 crc kubenswrapper[4912]: I0318 13:03:34.490953 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:34 crc kubenswrapper[4912]: I0318 13:03:34.491025 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:03:34 crc kubenswrapper[4912]: E0318 13:03:34.497575 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 13:03:35 crc kubenswrapper[4912]: I0318 13:03:35.165123 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:36 crc kubenswrapper[4912]: I0318 13:03:36.165489 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 18 13:03:37 crc kubenswrapper[4912]: I0318 13:03:37.165844 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:38 crc kubenswrapper[4912]: I0318 13:03:38.165696 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:39 crc kubenswrapper[4912]: I0318 13:03:39.164029 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:40 crc kubenswrapper[4912]: I0318 13:03:40.164827 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:41 crc kubenswrapper[4912]: I0318 13:03:41.165358 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:41 crc kubenswrapper[4912]: E0318 13:03:41.489136 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 13:03:41 crc kubenswrapper[4912]: I0318 13:03:41.498257 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 18 13:03:41 crc kubenswrapper[4912]: I0318 13:03:41.499385 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:41 crc kubenswrapper[4912]: I0318 13:03:41.499421 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:41 crc kubenswrapper[4912]: I0318 13:03:41.499431 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:41 crc kubenswrapper[4912]: I0318 13:03:41.499456 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:03:41 crc kubenswrapper[4912]: E0318 13:03:41.505105 4912 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 13:03:42 crc kubenswrapper[4912]: I0318 13:03:42.164456 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:42 crc kubenswrapper[4912]: I0318 13:03:42.227732 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:42 crc kubenswrapper[4912]: I0318 13:03:42.228865 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:42 crc kubenswrapper[4912]: I0318 13:03:42.228907 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:42 crc kubenswrapper[4912]: I0318 13:03:42.228942 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:42 crc kubenswrapper[4912]: E0318 13:03:42.321835 
4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:03:42 crc kubenswrapper[4912]: W0318 13:03:42.747013 4912 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:42 crc kubenswrapper[4912]: E0318 13:03:42.747104 4912 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.165162 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.228008 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.229311 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.229360 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.229370 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.230070 4912 scope.go:117] "RemoveContainer" containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" Mar 
18 13:03:43 crc kubenswrapper[4912]: E0318 13:03:43.230267 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.304779 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 13:03:43 crc kubenswrapper[4912]: I0318 13:03:43.317759 4912 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 13:03:44 crc kubenswrapper[4912]: I0318 13:03:44.165604 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:45 crc kubenswrapper[4912]: I0318 13:03:45.165677 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:46 crc kubenswrapper[4912]: I0318 13:03:46.164857 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:47 crc kubenswrapper[4912]: I0318 13:03:47.164201 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.166370 4912 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.376391 4912 csr.go:261] certificate signing request csr-rbz4s is approved, waiting to be issued Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.397067 4912 csr.go:257] certificate signing request csr-rbz4s is issued Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.443378 4912 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.506431 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.508653 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.508723 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.508742 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.508923 4912 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.520924 4912 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.521442 4912 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.521488 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.526686 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.526743 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.526757 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.526784 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.526799 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:48Z","lastTransitionTime":"2026-03-18T13:03:48Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.539185 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d
-99ca9bbe0732\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.548430 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.548480 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.548494 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.548520 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.548531 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:48Z","lastTransitionTime":"2026-03-18T13:03:48Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.559502 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d
-99ca9bbe0732\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.568222 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.568281 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.568297 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.568326 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.568345 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:48Z","lastTransitionTime":"2026-03-18T13:03:48Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.580745 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d
-99ca9bbe0732\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.591346 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.591402 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.591414 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.591440 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:48 crc kubenswrapper[4912]: I0318 13:03:48.591455 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:48Z","lastTransitionTime":"2026-03-18T13:03:48Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.610591 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:48Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d
-99ca9bbe0732\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.610754 4912 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.610792 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.711613 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.812390 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:48 crc kubenswrapper[4912]: E0318 13:03:48.913327 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: I0318 13:03:49.007393 4912 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.013636 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.113836 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.214214 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.314492 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: I0318 13:03:49.399659 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-21 06:40:11.400136457 +0000 UTC Mar 18 13:03:49 crc kubenswrapper[4912]: I0318 13:03:49.399725 4912 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5945h36m22.00041546s for next certificate rotation Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.415260 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.515811 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.616019 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.716598 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.817830 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:49 crc kubenswrapper[4912]: E0318 13:03:49.918251 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.019405 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.119677 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.220863 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.321682 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.421869 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.522387 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.622780 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.723644 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.823951 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:50 crc kubenswrapper[4912]: E0318 13:03:50.924934 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.025919 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.127080 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.227599 4912 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.327801 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.428611 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.528772 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.629983 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.730626 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.831854 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:51 crc kubenswrapper[4912]: E0318 13:03:51.932829 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.033194 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.133651 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.234154 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.323030 4912 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.335187 4912 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.435955 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.536354 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.637435 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.737650 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.838536 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:52 crc kubenswrapper[4912]: E0318 13:03:52.938696 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.039124 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.139344 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.239791 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.340799 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.440959 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc 
kubenswrapper[4912]: E0318 13:03:53.541865 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.642713 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.743409 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.843540 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:53 crc kubenswrapper[4912]: E0318 13:03:53.943647 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.044262 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.145320 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: I0318 13:03:54.227740 4912 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:03:54 crc kubenswrapper[4912]: I0318 13:03:54.229318 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:54 crc kubenswrapper[4912]: I0318 13:03:54.229357 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:54 crc kubenswrapper[4912]: I0318 13:03:54.229373 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:54 crc kubenswrapper[4912]: I0318 13:03:54.230085 4912 scope.go:117] "RemoveContainer" 
containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.230283 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.245726 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.346014 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.447163 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.547734 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.648320 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.749152 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.850299 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:54 crc kubenswrapper[4912]: E0318 13:03:54.950659 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.051178 4912 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.152282 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.253343 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.354524 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.455299 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.555678 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.656561 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.756890 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.857865 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:55 crc kubenswrapper[4912]: E0318 13:03:55.959091 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.060359 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.161126 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc 
kubenswrapper[4912]: E0318 13:03:56.262032 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.363098 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.464170 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.565163 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.665394 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.765560 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.866056 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:56 crc kubenswrapper[4912]: E0318 13:03:56.967153 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.067978 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.168808 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.269471 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.370315 4912 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.470520 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.570957 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.671450 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: I0318 13:03:57.685535 4912 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.772321 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.872719 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:57 crc kubenswrapper[4912]: E0318 13:03:57.973707 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.074161 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.174920 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.275151 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.376095 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.476824 4912 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.577567 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.626581 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.632809 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.632885 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.632903 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.632928 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.632948 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:58Z","lastTransitionTime":"2026-03-18T13:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.647487 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d-99ca9bbe0732\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.652211 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.652258 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.652272 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.652293 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.652308 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:58Z","lastTransitionTime":"2026-03-18T13:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.660893 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d-99ca9bbe0732\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.664773 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.664815 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.664833 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.664857 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.664875 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:58Z","lastTransitionTime":"2026-03-18T13:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.676435 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d-99ca9bbe0732\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.682713 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.682776 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.682793 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.682819 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:03:58 crc kubenswrapper[4912]: I0318 13:03:58.682838 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:03:58Z","lastTransitionTime":"2026-03-18T13:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.695312 4912 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T13:03:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b67b4615-c409-4182-8457-37817034d738\\\",\\\"systemUUID\\\":\\\"e10d31e1-6845-4aa5-a90d-99ca9bbe0732\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.695645 4912 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.695699 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.796802 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.897775 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:58 crc kubenswrapper[4912]: E0318 13:03:58.998199 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.098364 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.198976 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.299929 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.400157 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.500365 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.600871 4912 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.701330 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.801539 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:03:59 crc kubenswrapper[4912]: E0318 13:03:59.901673 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.002765 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.103123 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.204165 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: I0318 13:04:00.249897 4912 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.304358 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.404576 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.504816 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.605328 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc 
kubenswrapper[4912]: E0318 13:04:00.705999 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: E0318 13:04:00.807156 4912 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:04:00 crc kubenswrapper[4912]: I0318 13:04:00.847905 4912 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 13:04:00 crc kubenswrapper[4912]: I0318 13:04:00.909814 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:00 crc kubenswrapper[4912]: I0318 13:04:00.909875 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:00 crc kubenswrapper[4912]: I0318 13:04:00.909900 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:00 crc kubenswrapper[4912]: I0318 13:04:00.909930 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:00 crc kubenswrapper[4912]: I0318 13:04:00.909953 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:00Z","lastTransitionTime":"2026-03-18T13:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.013461 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.013515 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.013527 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.013544 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.013560 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.116603 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.116652 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.116665 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.116688 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.116703 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.194739 4912 apiserver.go:52] "Watching apiserver" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.200109 4912 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.200393 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.200805 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.200959 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.200996 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.201119 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.200955 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.201187 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.201204 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.201253 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.201287 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.203551 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.203556 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.203756 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.203939 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.204711 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.205422 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.205865 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.205960 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.206002 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.219460 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: 
I0318 13:04:01.219513 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.219530 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.219552 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.219569 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.229735 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.247326 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.259277 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.269878 4912 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.270849 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.280494 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.291294 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.302709 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320221 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320285 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320309 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320334 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320354 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320374 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320393 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320412 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320431 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320451 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320472 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320492 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320510 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320544 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320567 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320591 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320611 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320633 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320653 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320675 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320700 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320722 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320732 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320846 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320851 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320878 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320962 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320986 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.320984 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321005 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321076 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321094 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321115 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321130 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321147 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321163 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321178 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321192 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321264 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321284 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321303 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321333 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321354 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321374 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321390 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321407 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321421 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321436 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321451 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321466 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321486 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 
13:04:01.321502 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321518 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321532 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321548 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321568 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321593 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321614 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321638 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321661 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321682 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321703 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321720 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321737 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321754 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321768 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321782 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321799 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 13:04:01 crc 
kubenswrapper[4912]: I0318 13:04:01.321817 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321832 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321849 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321864 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321883 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321899 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321917 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321937 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321958 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321975 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321990 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322007 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322022 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322053 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322069 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322084 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322100 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322116 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322133 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322150 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322170 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322185 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322200 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322214 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322233 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322248 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322264 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322279 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322294 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322310 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322326 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322347 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322371 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 
18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322399 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322423 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322449 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322475 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322500 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322524 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322548 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322573 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322591 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322606 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322623 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322639 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322656 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322672 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322688 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322705 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322721 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322736 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322735 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322772 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322787 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323026 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323069 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322752 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323260 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323297 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323330 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323368 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323404 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323437 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323470 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323506 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323538 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323572 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 
13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323605 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323638 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323672 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323705 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323737 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323770 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323802 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323835 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323870 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323902 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323933 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 13:04:01 crc 
kubenswrapper[4912]: I0318 13:04:01.323956 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323990 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324013 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324035 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324091 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324115 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324143 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324178 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324209 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324238 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324270 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 13:04:01 crc kubenswrapper[4912]: 
I0318 13:04:01.324302 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324336 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324361 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324383 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324429 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324460 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") 
pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324482 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324513 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324548 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324572 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324601 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324637 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324673 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324706 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324805 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324845 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324878 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324914 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324950 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324984 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325014 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325070 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325104 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325136 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325168 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325204 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325237 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325272 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 
13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325307 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325330 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325358 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325391 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325426 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325457 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325488 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325523 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325557 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325583 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325610 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325632 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325656 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325678 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325705 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325740 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325774 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325797 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325842 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325873 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325901 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325926 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325953 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325979 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326003 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326060 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326101 4912 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326125 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326150 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326177 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326205 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326229 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326282 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326298 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326312 4912 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321069 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321457 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321492 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321603 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321639 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321847 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.321981 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322131 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322158 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322479 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322490 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322510 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322781 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322829 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323257 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328741 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323299 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323437 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323467 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323515 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323739 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.323825 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324029 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324070 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324295 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324369 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324629 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324655 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324689 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324714 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.322753 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.324887 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328984 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325230 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325540 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325705 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325760 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325855 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.325879 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.329090 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326158 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326216 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326337 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326338 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.326574 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.327124 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.329474 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:01.829440573 +0000 UTC m=+90.288868008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.327303 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.327687 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.327902 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328003 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328008 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328146 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328197 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328391 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328326 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.328711 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.329695 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.329786 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.330247 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.330683 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.330694 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.331148 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.331329 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.331462 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.331846 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.332653 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.332797 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333059 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333222 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333136 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333294 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333362 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333475 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333665 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.333835 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.334029 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.334164 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.334381 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.334379 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.334460 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.334576 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.334669 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335011 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335113 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335228 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335240 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335407 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335758 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335809 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335967 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.335998 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.336306 4912 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.337775 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:01.837753673 +0000 UTC m=+90.297181098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337066 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337054 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.336492 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.336516 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.336603 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.338246 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.336895 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337090 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337115 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337367 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337410 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.338381 4912 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337929 4912 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.338725 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.338390 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337685 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.338464 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:01.83844228 +0000 UTC m=+90.297869715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.341519 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.337897 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.338210 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.338760 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.338972 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.339080 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.339202 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.339560 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.339580 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.339610 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.339774 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.339972 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.340086 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.340262 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.340368 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.340377 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.340592 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.340941 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.341255 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.341467 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.341797 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.342328 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.342339 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.342579 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.343204 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.343209 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.343451 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.343664 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.343753 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.343797 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.343897 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.344127 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.344352 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.344962 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.345346 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.345391 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.346324 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.346451 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.346903 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.347243 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.347331 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.347638 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.348173 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.348238 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.348457 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.350077 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.352324 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.353567 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.353677 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.353739 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.353757 4912 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.353850 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:01.8538265 +0000 UTC m=+90.313253925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.361350 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.361795 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.361895 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.361882 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.362464 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.362698 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.362614 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.362782 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.362778 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.362834 4912 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.362905 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:01.862885858 +0000 UTC m=+90.322313283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.363096 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.363159 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.363408 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.363821 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.363923 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.367142 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.367348 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.367349 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.367555 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.369348 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.370820 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.370872 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.371444 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.373735 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.373815 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.374016 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.374151 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.374755 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.374914 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.375213 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.378617 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.381632 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.383844 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.384849 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.384879 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.385253 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.385304 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.386314 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.388992 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.389344 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.389593 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.390744 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.391253 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.391639 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.392146 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.392627 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.392961 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.405134 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.412225 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426381 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426436 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426452 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426470 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426482 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426655 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426762 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426856 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426870 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426881 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426892 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426904 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426917 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426931 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426942 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426956 4912 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426968 4912 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426979 4912 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.426990 4912 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427001 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427013 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427024 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427058 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427071 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427082 4912 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427093 4912 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427123 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427143 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427153 4912 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427163 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427172 4912 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427182 4912 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc 
kubenswrapper[4912]: I0318 13:04:01.427191 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427200 4912 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427208 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427216 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427224 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427234 4912 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427242 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427253 4912 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427265 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427271 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427278 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427289 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427299 4912 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427308 4912 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: 
I0318 13:04:01.427316 4912 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427327 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427336 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427344 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427345 4912 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427401 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427412 4912 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 
13:04:01.427422 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427445 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427461 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427474 4912 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427484 4912 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427492 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427502 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427510 4912 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427519 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427527 4912 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427537 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427546 4912 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427554 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427563 4912 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427572 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427580 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427589 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427597 4912 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427606 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427618 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427626 4912 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427636 4912 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 
18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427644 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427653 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427661 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427670 4912 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427680 4912 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427688 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427696 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427704 4912 reconciler_common.go:293] "Volume detached for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427712 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427753 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427772 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427785 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427796 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427808 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427823 4912 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427837 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.427849 4912 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428021 4912 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428122 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428165 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428180 4912 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428198 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on 
node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428214 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428228 4912 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428243 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428257 4912 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428270 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428286 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428300 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428314 4912 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428327 4912 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428339 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428352 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428364 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428376 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428389 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428402 4912 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428414 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428427 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428439 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428451 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428464 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428475 4912 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428487 4912 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428499 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428511 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428522 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428534 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428547 4912 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428559 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428571 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 
13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428586 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428598 4912 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428609 4912 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428622 4912 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428634 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428646 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428656 4912 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428668 4912 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428679 4912 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428691 4912 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428703 4912 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428714 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428728 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428739 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428753 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428764 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428776 4912 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428786 4912 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428798 4912 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428809 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428820 4912 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428832 4912 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 
13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428842 4912 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428854 4912 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428865 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428876 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428888 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428899 4912 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428909 4912 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428920 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428931 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428942 4912 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428955 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428967 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428979 4912 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.428992 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429003 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429015 4912 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429050 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429064 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429076 4912 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429092 4912 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429106 4912 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429117 4912 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429129 4912 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429142 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429154 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429168 4912 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429183 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429199 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429214 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429231 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429245 4912 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429260 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429276 4912 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429291 4912 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429308 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429322 4912 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath 
\"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429335 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429351 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429363 4912 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429374 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429387 4912 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429402 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429418 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429434 4912 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429450 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429464 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429476 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429489 4912 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429501 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429513 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429525 4912 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.429537 4912 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.532297 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.532361 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.532384 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.532400 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.532434 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.552892 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.566468 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.573117 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 13:04:01 crc kubenswrapper[4912]: W0318 13:04:01.581365 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c0a6aa4cfc50e40004079a714e764ecbfb62c3737e6ce7f1647015e46727ac00 WatchSource:0}: Error finding container c0a6aa4cfc50e40004079a714e764ecbfb62c3737e6ce7f1647015e46727ac00: Status 404 returned error can't find the container with id c0a6aa4cfc50e40004079a714e764ecbfb62c3737e6ce7f1647015e46727ac00 Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.601152 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7c145ac1805b468f3f76ffa3d7b0d02140e939e42491b78ff114ee350e484390"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.601797 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c0a6aa4cfc50e40004079a714e764ecbfb62c3737e6ce7f1647015e46727ac00"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.602483 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cb90cbcd2597ef79de6d7e4cfc1567b66447dd6e2b6da502c7c9609113c872ab"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.637677 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.637732 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.637747 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.637769 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.637783 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.739547 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.739584 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.739595 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.739610 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.739622 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.838547 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.838662 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.838778 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:02.838736457 +0000 UTC m=+91.298163972 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.838798 4912 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.838902 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:02.83889189 +0000 UTC m=+91.298319575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.838895 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.838986 4912 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.839091 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:02.839067945 +0000 UTC m=+91.298495400 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.843359 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.843400 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.843411 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.843426 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.843438 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.940025 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.940136 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940206 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940224 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940230 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940240 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940248 4912 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940252 4912 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940304 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:02.9402861 +0000 UTC m=+91.399713525 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: E0318 13:04:01.940323 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:02.940316011 +0000 UTC m=+91.399743436 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.945692 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.945759 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.945775 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.945789 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:01 crc kubenswrapper[4912]: I0318 13:04:01.945801 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:01Z","lastTransitionTime":"2026-03-18T13:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.048258 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.048308 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.048323 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.048342 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.048356 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.150967 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.151030 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.151092 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.151122 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.151142 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.232622 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.233231 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.234672 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.237743 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.240243 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.240786 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.241579 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.242290 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.243417 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.244178 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.245281 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.245852 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.247300 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.247966 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.248606 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.249762 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.250434 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.251720 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.252219 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.252906 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.254182 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.254735 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.254896 4912 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.255508 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.255567 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.255581 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.255603 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.255618 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.255883 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.256369 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.257471 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.257926 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.258546 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.259630 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.260163 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.261063 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.261584 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.262429 4912 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.262532 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.264597 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.265566 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.266134 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.266815 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.267762 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.268578 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.269613 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.270259 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.271507 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.272704 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.273687 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.274331 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.275516 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.275971 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.277064 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.277683 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.279057 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.279302 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.279541 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.280578 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.281112 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.282232 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.282895 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.283479 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.293377 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.306978 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.358804 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.358882 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.358893 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.358906 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.358938 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.461783 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.461838 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.461847 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.461863 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.461873 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.565062 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.565119 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.565133 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.565155 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.565179 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.607338 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4df70b0f4db3e25f9313a67ee6a41ca10059981e2b6b12ee73b66f91d2bb5ea8"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.607559 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e95c0e7be4d5b4ebc74f2ca465836283d02c2065203c5a6a6e1e94ab20ec1f3b"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.609265 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bd9fd9409376b1980c0c5ead7190ac0d45fc9056d871072f98f36a6294b0a1df"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.622648 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.636905 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4df70b0f4db3e25f9313a67ee6a41ca10059981e2b6b12ee73b66f91d2bb5ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95c0e7be4d5b4ebc74f2ca465836283d02c2065203c5a6a6e1e94ab20ec1f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.652241 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.663005 4912 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.667547 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.667586 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.667595 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.667641 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.667653 4912 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.675621 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.687002 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.698220 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd9fd9409376b1980c0c5ead7190ac0d45fc9056d871072f98f36a6294b0a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.709138 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.722200 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.733417 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.744608 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.758293 4912 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4df70b0f4db3e25f9313a67ee6a41ca10059981e2b6b12ee73b66f91d2bb5ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95c0e7be4d5b4ebc74f2ca465836283d02c2065203c5a6a6e1e94ab20ec1f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T13:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.769974 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.770019 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.770032 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.770073 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.770088 4912 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.848056 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.848136 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.848165 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.848235 4912 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.848267 4912 secret.go:188] Couldn't 
get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.848273 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:04.848245236 +0000 UTC m=+93.307672661 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.848343 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:04.848326138 +0000 UTC m=+93.307753583 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.848359 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:04.848351058 +0000 UTC m=+93.307778503 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.873104 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.873147 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.873156 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.873172 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.873181 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.949465 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.949501 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949612 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949625 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949637 4912 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949684 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:04.949671086 +0000 UTC m=+93.409098501 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949684 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949726 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949742 4912 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:02 crc kubenswrapper[4912]: E0318 13:04:02.949803 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:04.949784039 +0000 UTC m=+93.409211464 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.976136 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.976198 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.976207 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.976222 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:02 crc kubenswrapper[4912]: I0318 13:04:02.976232 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:02Z","lastTransitionTime":"2026-03-18T13:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.078276 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.078323 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.078338 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.078360 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.078372 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.181207 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.181245 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.181257 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.181272 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.181284 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.227028 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.227160 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:03 crc kubenswrapper[4912]: E0318 13:04:03.227279 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.227345 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:03 crc kubenswrapper[4912]: E0318 13:04:03.227481 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:03 crc kubenswrapper[4912]: E0318 13:04:03.227618 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.288058 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.288097 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.288106 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.288120 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.288130 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.391129 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.391162 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.391171 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.391184 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.391195 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.493205 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.493245 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.493252 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.493267 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.493277 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.595307 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.595350 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.595361 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.595377 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.595387 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.697136 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.697180 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.697193 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.697210 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.697223 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.799582 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.799625 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.799661 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.799679 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.799690 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.902391 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.902465 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.902488 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.902521 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:03 crc kubenswrapper[4912]: I0318 13:04:03.902542 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:03Z","lastTransitionTime":"2026-03-18T13:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.004812 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.004886 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.004902 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.004917 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.004927 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.107725 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.107757 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.107764 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.107776 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.107785 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.211194 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.211284 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.211299 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.211324 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.211344 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.314063 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.314118 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.314131 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.314150 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.314164 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.415972 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.416014 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.416025 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.416068 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.416081 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.518756 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.518823 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.518835 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.518851 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.518863 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.615817 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e2ede33bf1764d708e4895344e540ea4431b9cb98873b932316571df572e4a43"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.620498 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.620549 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.620566 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.620586 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.620601 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.723016 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.723081 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.723093 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.723115 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.723128 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.825438 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.825494 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.825503 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.825521 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.825531 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.868270 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.868415 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.868460 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.868565 4912 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.868638 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:08.868578604 +0000 UTC m=+97.328006069 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.868695 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:08.868673186 +0000 UTC m=+97.328100661 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.868717 4912 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.868870 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:08.86884015 +0000 UTC m=+97.328267585 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.928383 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.928451 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.928465 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.928490 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.928504 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:04Z","lastTransitionTime":"2026-03-18T13:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.969008 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:04 crc kubenswrapper[4912]: I0318 13:04:04.969079 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969217 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969234 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969246 4912 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969302 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:08.969285777 +0000 UTC m=+97.428713202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969341 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969394 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969414 4912 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:04 crc kubenswrapper[4912]: E0318 13:04:04.969514 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:08.969481302 +0000 UTC m=+97.428908737 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.031496 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.031593 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.031617 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.031650 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.031672 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.134535 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.134589 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.134603 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.134621 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.134633 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.164899 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2wcxr"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.165288 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.167126 4912 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.167181 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.167287 4912 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.167366 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.167949 4912 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" 
cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.167987 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.178899 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-vsp6g"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.179347 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.181796 4912 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.181862 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.182114 4912 reflector.go:561] 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.182146 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.182198 4912 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.182211 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.182554 4912 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets 
"proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.182571 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.182606 4912 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.182617 4912 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.184614 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4m6dz"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.185640 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.188398 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gdg7r"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.189079 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.189190 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.189088 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.190595 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.190995 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.192650 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.192935 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.192985 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.202219 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sns58"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.204369 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.207553 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.210421 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.211672 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.211701 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.211702 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.212248 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.212490 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.226835 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.226975 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.227444 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.227480 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.227504 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.227633 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.237635 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.237669 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.237678 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.237693 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.237707 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.243874 4912 scope.go:117] "RemoveContainer" containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.244410 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.271776 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-hostroot\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.271822 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-cni-bin\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.271848 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkd78\" (UniqueName: \"kubernetes.io/projected/7799bd07-fe62-47e1-b738-e097930474f1-kube-api-access-rkd78\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.271868 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-socket-dir-parent\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 
18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.271934 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-k8s-cni-cncf-io\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.271984 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6jf\" (UniqueName: \"kubernetes.io/projected/1b4e18f7-a93f-463f-a208-2002cdf73919-kube-api-access-sw6jf\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272024 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-os-release\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272076 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-conf-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272113 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-netns\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 
13:04:05.272160 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-kubelet\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272188 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-cnibin\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272217 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272243 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7799bd07-fe62-47e1-b738-e097930474f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272311 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b4e18f7-a93f-463f-a208-2002cdf73919-cni-binary-copy\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " 
pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272342 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-cni-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272373 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-etc-kubernetes\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272418 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-os-release\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272484 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-cni-multus\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272512 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-system-cni-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " 
pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272536 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-daemon-config\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272576 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mks6c\" (UniqueName: \"kubernetes.io/projected/b8c4ee9c-d987-4db8-904a-8b903b0894aa-kube-api-access-mks6c\") pod \"node-resolver-2wcxr\" (UID: \"b8c4ee9c-d987-4db8-904a-8b903b0894aa\") " pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272602 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0c45cd5-793c-419f-8fe6-a2239050972e-proxy-tls\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272630 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0c45cd5-793c-419f-8fe6-a2239050972e-mcd-auth-proxy-config\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272648 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-cnibin\") pod \"multus-gdg7r\" (UID: 
\"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272667 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c0c45cd5-793c-419f-8fe6-a2239050972e-rootfs\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272742 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7799bd07-fe62-47e1-b738-e097930474f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272822 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-multus-certs\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272848 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-system-cni-dir\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272873 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/b8c4ee9c-d987-4db8-904a-8b903b0894aa-hosts-file\") pod \"node-resolver-2wcxr\" (UID: \"b8c4ee9c-d987-4db8-904a-8b903b0894aa\") " pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.272907 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkkrc\" (UniqueName: \"kubernetes.io/projected/c0c45cd5-793c-419f-8fe6-a2239050972e-kube-api-access-fkkrc\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.342575 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.342620 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.342632 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.342645 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.342657 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373627 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-socket-dir-parent\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373666 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-k8s-cni-cncf-io\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373686 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkd78\" (UniqueName: \"kubernetes.io/projected/7799bd07-fe62-47e1-b738-e097930474f1-kube-api-access-rkd78\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373710 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-log-socket\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373720 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-socket-dir-parent\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" 
Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373736 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6jf\" (UniqueName: \"kubernetes.io/projected/1b4e18f7-a93f-463f-a208-2002cdf73919-kube-api-access-sw6jf\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373760 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-kubelet\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373806 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-k8s-cni-cncf-io\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.373986 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374056 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374084 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovn-node-metrics-cert\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374165 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-os-release\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374203 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-script-lib\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374283 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-os-release\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374357 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-conf-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 
13:04:05.374405 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-conf-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374460 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-env-overrides\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374534 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b4e18f7-a93f-463f-a208-2002cdf73919-cni-binary-copy\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374561 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-netns\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374585 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-kubelet\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374608 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-cnibin\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374630 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374656 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7799bd07-fe62-47e1-b738-e097930474f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374687 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-netns\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374712 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-netd\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374735 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-slash\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374762 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-cni-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374784 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-etc-kubernetes\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374807 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-os-release\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374832 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-config\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374859 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-systemd-units\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374902 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-cni-multus\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374924 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-bin\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374965 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-system-cni-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.374991 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-daemon-config\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375205 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-etc-kubernetes\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375014 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-var-lib-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375256 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0c45cd5-793c-419f-8fe6-a2239050972e-mcd-auth-proxy-config\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375277 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-os-release\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375290 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-cnibin\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375319 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mks6c\" (UniqueName: 
\"kubernetes.io/projected/b8c4ee9c-d987-4db8-904a-8b903b0894aa-kube-api-access-mks6c\") pod \"node-resolver-2wcxr\" (UID: \"b8c4ee9c-d987-4db8-904a-8b903b0894aa\") " pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375347 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-node-log\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375369 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0c45cd5-793c-419f-8fe6-a2239050972e-proxy-tls\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375394 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c0c45cd5-793c-419f-8fe6-a2239050972e-rootfs\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375412 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-cni-dir\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375460 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-kubelet\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375418 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7799bd07-fe62-47e1-b738-e097930474f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375418 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b4e18f7-a93f-463f-a208-2002cdf73919-cni-binary-copy\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375510 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-systemd\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375532 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-netns\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375553 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkkrc\" (UniqueName: \"kubernetes.io/projected/c0c45cd5-793c-419f-8fe6-a2239050972e-kube-api-access-fkkrc\") pod 
\"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-multus-certs\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375620 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-system-cni-dir\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375647 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b8c4ee9c-d987-4db8-904a-8b903b0894aa-hosts-file\") pod \"node-resolver-2wcxr\" (UID: \"b8c4ee9c-d987-4db8-904a-8b903b0894aa\") " pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375681 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-etc-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375689 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-system-cni-dir\") pod \"multus-gdg7r\" (UID: 
\"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375708 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375767 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-hostroot\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375794 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-ovn\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375820 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbjdr\" (UniqueName: \"kubernetes.io/projected/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-kube-api-access-sbjdr\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375852 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-cni-bin\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " 
pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375904 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375961 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-cni-bin\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375967 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-cnibin\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.376207 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c0c45cd5-793c-419f-8fe6-a2239050972e-rootfs\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.376242 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-system-cni-dir\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.376428 
4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-run-multus-certs\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.376431 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7799bd07-fe62-47e1-b738-e097930474f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.375621 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-host-var-lib-cni-multus\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.376510 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b4e18f7-a93f-463f-a208-2002cdf73919-hostroot\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.376517 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b8c4ee9c-d987-4db8-904a-8b903b0894aa-hosts-file\") pod \"node-resolver-2wcxr\" (UID: \"b8c4ee9c-d987-4db8-904a-8b903b0894aa\") " pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.376724 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7799bd07-fe62-47e1-b738-e097930474f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.377341 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7799bd07-fe62-47e1-b738-e097930474f1-cnibin\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.377637 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b4e18f7-a93f-463f-a208-2002cdf73919-multus-daemon-config\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.400015 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkd78\" (UniqueName: \"kubernetes.io/projected/7799bd07-fe62-47e1-b738-e097930474f1-kube-api-access-rkd78\") pod \"multus-additional-cni-plugins-4m6dz\" (UID: \"7799bd07-fe62-47e1-b738-e097930474f1\") " pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.401273 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6jf\" (UniqueName: \"kubernetes.io/projected/1b4e18f7-a93f-463f-a208-2002cdf73919-kube-api-access-sw6jf\") pod \"multus-gdg7r\" (UID: \"1b4e18f7-a93f-463f-a208-2002cdf73919\") " pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.409810 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x7dhq"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 
13:04:05.410214 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.414081 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.414546 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.415077 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.415905 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.445135 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.445201 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.445214 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.445235 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.445251 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476568 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-systemd\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476631 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-etc-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476657 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476697 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-ovn\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476721 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-etc-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc 
kubenswrapper[4912]: I0318 13:04:05.476716 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-systemd\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476725 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbjdr\" (UniqueName: \"kubernetes.io/projected/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-kube-api-access-sbjdr\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476776 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-ovn\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-log-socket\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476826 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476803 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-log-socket\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476936 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovn-node-metrics-cert\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.476976 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-kubelet\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477001 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477027 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477137 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-script-lib\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477168 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-env-overrides\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477226 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-slash\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477249 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-netns\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477250 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477273 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-netd\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477313 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-systemd-units\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477336 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-config\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477375 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-bin\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477415 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-var-lib-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477420 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-slash\") pod 
\"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477478 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-var-lib-openvswitch\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477497 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-node-log\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477516 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-netns\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.477585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-bin\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.478184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-env-overrides\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.478219 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-node-log\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.478234 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-netd\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.478257 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-systemd-units\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.478477 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-kubelet\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.478487 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc 
kubenswrapper[4912]: I0318 13:04:05.478629 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-script-lib\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.478718 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-config\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.481340 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovn-node-metrics-cert\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.498458 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbjdr\" (UniqueName: \"kubernetes.io/projected/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-kube-api-access-sbjdr\") pod \"ovnkube-node-sns58\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.515101 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.529191 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7799bd07_fe62_47e1_b738_e097930474f1.slice/crio-1a5d2b7eb70cad1696ec2bdaefea7816361586852940c0ed70210ae6e608372d WatchSource:0}: Error finding container 1a5d2b7eb70cad1696ec2bdaefea7816361586852940c0ed70210ae6e608372d: Status 404 returned error can't find the container with id 1a5d2b7eb70cad1696ec2bdaefea7816361586852940c0ed70210ae6e608372d Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.530676 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gdg7r" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.539397 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.548111 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.548139 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.548148 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.548165 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.548176 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.549989 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4e18f7_a93f_463f_a208_2002cdf73919.slice/crio-22fc618a1c52254db6a0a08c6268533d1f35442ca2a1817f487ae22e06054900 WatchSource:0}: Error finding container 22fc618a1c52254db6a0a08c6268533d1f35442ca2a1817f487ae22e06054900: Status 404 returned error can't find the container with id 22fc618a1c52254db6a0a08c6268533d1f35442ca2a1817f487ae22e06054900 Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.572465 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5fc3074_5b30_4c2d_ae24_dfa5de9b835b.slice/crio-c3a41ec619bf924cc9836633f286ebf282abcd38faa1f015861d1e6dc11fc903 WatchSource:0}: Error finding container c3a41ec619bf924cc9836633f286ebf282abcd38faa1f015861d1e6dc11fc903: Status 404 returned error can't find the container with id c3a41ec619bf924cc9836633f286ebf282abcd38faa1f015861d1e6dc11fc903 Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.577948 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5fffc6e-8974-427e-a8a6-3924a6774829-serviceca\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.577987 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5fffc6e-8974-427e-a8a6-3924a6774829-host\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " 
pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.578063 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2jfv\" (UniqueName: \"kubernetes.io/projected/d5fffc6e-8974-427e-a8a6-3924a6774829-kube-api-access-c2jfv\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.594914 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.595476 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.597189 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.599396 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.617116 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-q4ppq"] Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.617782 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.617927 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q4ppq" podUID="1a9fc2ce-3a71-465b-823d-5b1af71d635c" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.627077 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerStarted","Data":"1a5d2b7eb70cad1696ec2bdaefea7816361586852940c0ed70210ae6e608372d"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.629438 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.633778 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.634939 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.636382 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"c3a41ec619bf924cc9836633f286ebf282abcd38faa1f015861d1e6dc11fc903"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.640233 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gdg7r" event={"ID":"1b4e18f7-a93f-463f-a208-2002cdf73919","Type":"ContainerStarted","Data":"22fc618a1c52254db6a0a08c6268533d1f35442ca2a1817f487ae22e06054900"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.652063 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc 
kubenswrapper[4912]: I0318 13:04:05.652114 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.652127 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.652147 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.652164 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.679716 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.679758 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd6hh\" (UniqueName: \"kubernetes.io/projected/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-kube-api-access-gd6hh\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.679791 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.679840 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.679901 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2jfv\" (UniqueName: \"kubernetes.io/projected/d5fffc6e-8974-427e-a8a6-3924a6774829-kube-api-access-c2jfv\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.679963 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5fffc6e-8974-427e-a8a6-3924a6774829-serviceca\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.679984 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5fffc6e-8974-427e-a8a6-3924a6774829-host\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.680082 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5fffc6e-8974-427e-a8a6-3924a6774829-host\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.681207 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5fffc6e-8974-427e-a8a6-3924a6774829-serviceca\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.698635 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2jfv\" (UniqueName: \"kubernetes.io/projected/d5fffc6e-8974-427e-a8a6-3924a6774829-kube-api-access-c2jfv\") pod \"node-ca-x7dhq\" (UID: \"d5fffc6e-8974-427e-a8a6-3924a6774829\") " pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.747166 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-x7dhq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.754883 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.754923 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.754938 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.754957 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.754968 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.781069 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.781122 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzmn2\" (UniqueName: \"kubernetes.io/projected/1a9fc2ce-3a71-465b-823d-5b1af71d635c-kube-api-access-nzmn2\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.781183 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.781226 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.781250 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd6hh\" (UniqueName: \"kubernetes.io/projected/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-kube-api-access-gd6hh\") pod 
\"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.781281 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.781913 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.782279 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.784171 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.797072 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gd6hh\" (UniqueName: \"kubernetes.io/projected/8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78-kube-api-access-gd6hh\") pod \"ovnkube-control-plane-749d76644c-jjlql\" (UID: \"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.857849 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.857910 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.857924 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.857943 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.857955 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.882881 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzmn2\" (UniqueName: \"kubernetes.io/projected/1a9fc2ce-3a71-465b-823d-5b1af71d635c-kube-api-access-nzmn2\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.882959 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.883133 4912 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:05 crc kubenswrapper[4912]: E0318 13:04:05.883205 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs podName:1a9fc2ce-3a71-465b-823d-5b1af71d635c nodeName:}" failed. No retries permitted until 2026-03-18 13:04:06.383187886 +0000 UTC m=+94.842615331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs") pod "network-metrics-daemon-q4ppq" (UID: "1a9fc2ce-3a71-465b-823d-5b1af71d635c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.900155 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzmn2\" (UniqueName: \"kubernetes.io/projected/1a9fc2ce-3a71-465b-823d-5b1af71d635c-kube-api-access-nzmn2\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.913128 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" Mar 18 13:04:05 crc kubenswrapper[4912]: W0318 13:04:05.927888 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8fca4f_3a3a_4a7f_ab6e_ded15ab37c78.slice/crio-fde8163242d1cacebcd0bd9df8a4b5561b52c519968dd0f2724b8936558f2046 WatchSource:0}: Error finding container fde8163242d1cacebcd0bd9df8a4b5561b52c519968dd0f2724b8936558f2046: Status 404 returned error can't find the container with id fde8163242d1cacebcd0bd9df8a4b5561b52c519968dd0f2724b8936558f2046 Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.959967 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.960011 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.960021 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:05 crc 
kubenswrapper[4912]: I0318 13:04:05.960051 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.960066 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:05Z","lastTransitionTime":"2026-03-18T13:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.996067 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 13:04:05 crc kubenswrapper[4912]: I0318 13:04:05.999851 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.051524 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.063227 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.063262 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.063272 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.063288 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.063303 4912 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.099453 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.107755 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mks6c\" (UniqueName: \"kubernetes.io/projected/b8c4ee9c-d987-4db8-904a-8b903b0894aa-kube-api-access-mks6c\") pod \"node-resolver-2wcxr\" (UID: \"b8c4ee9c-d987-4db8-904a-8b903b0894aa\") " pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.165988 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.166017 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.166027 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.166055 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.166066 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.268352 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.268407 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.268416 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.268433 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.268447 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.297104 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.307297 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkkrc\" (UniqueName: \"kubernetes.io/projected/c0c45cd5-793c-419f-8fe6-a2239050972e-kube-api-access-fkkrc\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.325577 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.326490 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0c45cd5-793c-419f-8fe6-a2239050972e-mcd-auth-proxy-config\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.372420 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.372472 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.372483 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.372508 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.372521 
4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: E0318 13:04:06.376245 4912 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 18 13:04:06 crc kubenswrapper[4912]: E0318 13:04:06.376321 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c45cd5-793c-419f-8fe6-a2239050972e-proxy-tls podName:c0c45cd5-793c-419f-8fe6-a2239050972e nodeName:}" failed. No retries permitted until 2026-03-18 13:04:06.87629737 +0000 UTC m=+95.335724795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c0c45cd5-793c-419f-8fe6-a2239050972e-proxy-tls") pod "machine-config-daemon-vsp6g" (UID: "c0c45cd5-793c-419f-8fe6-a2239050972e") : failed to sync secret cache: timed out waiting for the condition Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.378520 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2wcxr" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.387347 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:06 crc kubenswrapper[4912]: E0318 13:04:06.387537 4912 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:06 crc kubenswrapper[4912]: E0318 13:04:06.387628 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs podName:1a9fc2ce-3a71-465b-823d-5b1af71d635c nodeName:}" failed. No retries permitted until 2026-03-18 13:04:07.387601342 +0000 UTC m=+95.847028767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs") pod "network-metrics-daemon-q4ppq" (UID: "1a9fc2ce-3a71-465b-823d-5b1af71d635c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:06 crc kubenswrapper[4912]: W0318 13:04:06.392352 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c4ee9c_d987_4db8_904a_8b903b0894aa.slice/crio-58b4efac7a4b92f848b3bd8f5b9ffd20749a3778d65913dfb837448dd4299a3a WatchSource:0}: Error finding container 58b4efac7a4b92f848b3bd8f5b9ffd20749a3778d65913dfb837448dd4299a3a: Status 404 returned error can't find the container with id 58b4efac7a4b92f848b3bd8f5b9ffd20749a3778d65913dfb837448dd4299a3a Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.475183 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.475238 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.475250 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.475268 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.475281 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.577690 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.577730 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.577739 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.577753 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.577762 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.646339 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gdg7r" event={"ID":"1b4e18f7-a93f-463f-a208-2002cdf73919","Type":"ContainerStarted","Data":"aa30387b3d7b5eafd9e2361d21c1da8c6b47920a636403143a6e2a3d2e0e48c9"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.650174 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x7dhq" event={"ID":"d5fffc6e-8974-427e-a8a6-3924a6774829","Type":"ContainerStarted","Data":"eb69c2a2cc0e8e43a578e840203e0b1f651b51d59e253c285878344e1e333bf5"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.650256 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x7dhq" event={"ID":"d5fffc6e-8974-427e-a8a6-3924a6774829","Type":"ContainerStarted","Data":"4350ba388dfad0910fb833800c33482454e1379fe000a7024d48970ba8207545"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.653553 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="c52b03a6de15f95580fbf97f6340e58e0a48e6aa02e7684de32fd41ce749e436" exitCode=0 Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.653638 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"c52b03a6de15f95580fbf97f6340e58e0a48e6aa02e7684de32fd41ce749e436"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.656202 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wcxr" event={"ID":"b8c4ee9c-d987-4db8-904a-8b903b0894aa","Type":"ContainerStarted","Data":"6878300d971d1efd8a52ed1e6c51290e6ea36a5ac3f4d370b3db902aaab8c50c"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.656246 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-2wcxr" event={"ID":"b8c4ee9c-d987-4db8-904a-8b903b0894aa","Type":"ContainerStarted","Data":"58b4efac7a4b92f848b3bd8f5b9ffd20749a3778d65913dfb837448dd4299a3a"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.657692 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" event={"ID":"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78","Type":"ContainerStarted","Data":"ebfec0e370d9080efe9b68da6f5b03f3853275575458d6022326264077ef3c02"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.657726 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" event={"ID":"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78","Type":"ContainerStarted","Data":"1565e258f09eedd9a49f513c29a74489c284c28ef3dec363bff9302131b31c3e"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.657743 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" event={"ID":"8c8fca4f-3a3a-4a7f-ab6e-ded15ab37c78","Type":"ContainerStarted","Data":"fde8163242d1cacebcd0bd9df8a4b5561b52c519968dd0f2724b8936558f2046"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.659485 4912 generic.go:334] "Generic (PLEG): container finished" podID="7799bd07-fe62-47e1-b738-e097930474f1" containerID="4189deed16c50c4b2235c0b6650a2425c744dcb9394b6a84293894dbf0176b2b" exitCode=0 Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.659632 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerDied","Data":"4189deed16c50c4b2235c0b6650a2425c744dcb9394b6a84293894dbf0176b2b"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.670093 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podStartSLOduration=1.6700725379999999 podStartE2EDuration="1.670072538s" podCreationTimestamp="2026-03-18 13:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:05.655793305 +0000 UTC m=+94.115220730" watchObservedRunningTime="2026-03-18 13:04:06.670072538 +0000 UTC m=+95.129499963" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.670234 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gdg7r" podStartSLOduration=39.670230072 podStartE2EDuration="39.670230072s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:06.669345101 +0000 UTC m=+95.128772526" watchObservedRunningTime="2026-03-18 13:04:06.670230072 +0000 UTC m=+95.129657497" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.680393 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.680426 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.680436 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.680449 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.680458 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.686936 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.703748 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x7dhq" podStartSLOduration=39.703723228 podStartE2EDuration="39.703723228s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:06.685889019 +0000 UTC m=+95.145316464" watchObservedRunningTime="2026-03-18 13:04:06.703723228 +0000 UTC m=+95.163150663" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.703911 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2wcxr" podStartSLOduration=39.703904872 podStartE2EDuration="39.703904872s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:06.703123713 +0000 UTC m=+95.162551138" watchObservedRunningTime="2026-03-18 13:04:06.703904872 +0000 UTC m=+95.163332307" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.759722 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.778898 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jjlql" podStartSLOduration=39.778881386 podStartE2EDuration="39.778881386s" podCreationTimestamp="2026-03-18 13:03:27 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:06.742381318 +0000 UTC m=+95.201808763" watchObservedRunningTime="2026-03-18 13:04:06.778881386 +0000 UTC m=+95.238308811" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.785616 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.785650 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.785661 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.785676 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.785687 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.887888 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.887922 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.887932 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.887948 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.887958 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.891988 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0c45cd5-793c-419f-8fe6-a2239050972e-proxy-tls\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.895538 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0c45cd5-793c-419f-8fe6-a2239050972e-proxy-tls\") pod \"machine-config-daemon-vsp6g\" (UID: \"c0c45cd5-793c-419f-8fe6-a2239050972e\") " pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.989620 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.989655 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.989665 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.989681 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.989694 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:06Z","lastTransitionTime":"2026-03-18T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:06 crc kubenswrapper[4912]: I0318 13:04:06.997474 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:04:07 crc kubenswrapper[4912]: W0318 13:04:07.008887 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c45cd5_793c_419f_8fe6_a2239050972e.slice/crio-222cadb559917f7c28e0017ad1fb6002ed986c79d50ef30f5599cc6a644dd789 WatchSource:0}: Error finding container 222cadb559917f7c28e0017ad1fb6002ed986c79d50ef30f5599cc6a644dd789: Status 404 returned error can't find the container with id 222cadb559917f7c28e0017ad1fb6002ed986c79d50ef30f5599cc6a644dd789 Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.092832 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.092883 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.092897 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.092916 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.092930 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.194955 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.195002 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.195017 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.195050 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.195066 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.227697 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.227718 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.227757 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.227831 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:07 crc kubenswrapper[4912]: E0318 13:04:07.227971 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:07 crc kubenswrapper[4912]: E0318 13:04:07.228124 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:07 crc kubenswrapper[4912]: E0318 13:04:07.228223 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q4ppq" podUID="1a9fc2ce-3a71-465b-823d-5b1af71d635c" Mar 18 13:04:07 crc kubenswrapper[4912]: E0318 13:04:07.228371 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.298305 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.298357 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.298369 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.298389 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.298400 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.397280 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:07 crc kubenswrapper[4912]: E0318 13:04:07.397539 4912 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:07 crc kubenswrapper[4912]: E0318 13:04:07.397649 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs podName:1a9fc2ce-3a71-465b-823d-5b1af71d635c nodeName:}" failed. No retries permitted until 2026-03-18 13:04:09.397624053 +0000 UTC m=+97.857051478 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs") pod "network-metrics-daemon-q4ppq" (UID: "1a9fc2ce-3a71-465b-823d-5b1af71d635c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.402409 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.402463 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.402478 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.402504 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.402519 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.506127 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.506181 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.506194 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.506217 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.506231 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.609466 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.609521 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.609530 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.609546 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.609558 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.667081 4912 generic.go:334] "Generic (PLEG): container finished" podID="7799bd07-fe62-47e1-b738-e097930474f1" containerID="d16b6026c16b3dd86187ec6239672d943f1dfbf2dcd8bc00cb1fb54f1468ab91" exitCode=0 Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.667144 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerDied","Data":"d16b6026c16b3dd86187ec6239672d943f1dfbf2dcd8bc00cb1fb54f1468ab91"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.674072 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"bca229aab67540876b235ab78ea602d1fb804e17f3dd261c91692cad71bc8042"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.674137 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"1142fd434aefb47b095035cb7e6ea0b08251b72a986559eddb2a5d98975576bf"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.674156 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"c4650c42543642637bb48d93405026a2703095327cf5b0f12a62d7a03c02ffec"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.674171 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"d6ebbb1b430aa5a43ad82ccfc0e12882330301d269ada003e7d076f9fb7681ad"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.674184 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"2653541d8a0fac899e494dbb6ad02c37bcf960a1555e528b0a2558d4cdbd1b00"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.674197 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"3deee2e05e350b74cc8df647f0e9ebe8eff851d1da3855bd525fb218cb39db5d"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.676751 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"6b621db07f5e14fb5cbe73905952022a8324f0758608c5472580edbec2002d06"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.676815 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"acda12973c95ab76d196e7a3ee0eb3d698f14dd5571fede1cd9a2b7308ff3614"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.676828 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"222cadb559917f7c28e0017ad1fb6002ed986c79d50ef30f5599cc6a644dd789"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.713770 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.713808 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.713819 4912 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.713835 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.713847 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.729121 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podStartSLOduration=40.729091838 podStartE2EDuration="40.729091838s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:07.728276689 +0000 UTC m=+96.187704144" watchObservedRunningTime="2026-03-18 13:04:07.729091838 +0000 UTC m=+96.188519273" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.817868 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.818353 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.818364 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.818379 4912 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.818390 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.920654 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.920702 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.920715 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.920731 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:07 crc kubenswrapper[4912]: I0318 13:04:07.920746 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:07Z","lastTransitionTime":"2026-03-18T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.023415 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.023459 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.023472 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.023489 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.023501 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.127235 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.127277 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.127285 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.127301 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.127310 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.229811 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.229859 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.229870 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.229886 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.229900 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.332946 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.333018 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.333032 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.333079 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.333100 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.435945 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.435983 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.435992 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.436007 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.436017 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.538293 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.538333 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.538345 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.538358 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.538368 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.642001 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.642580 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.642590 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.642612 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.642623 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.683657 4912 generic.go:334] "Generic (PLEG): container finished" podID="7799bd07-fe62-47e1-b738-e097930474f1" containerID="2e60667c7791f53fe09ded0a63b5df10d248ba76c26a074bf4944e55185404dd" exitCode=0 Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.683764 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerDied","Data":"2e60667c7791f53fe09ded0a63b5df10d248ba76c26a074bf4944e55185404dd"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.714278 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.714356 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.714379 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.714407 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.714435 4912 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T13:04:08Z","lastTransitionTime":"2026-03-18T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.770496 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk"] Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.770927 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.780889 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.782591 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.782710 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.787834 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914378 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:08 crc kubenswrapper[4912]: E0318 13:04:08.914550 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:16.914521979 +0000 UTC m=+105.373949404 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914602 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e3c10b87-4662-4754-98f1-bccce93024e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914692 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3c10b87-4662-4754-98f1-bccce93024e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914728 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914757 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/e3c10b87-4662-4754-98f1-bccce93024e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914781 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3c10b87-4662-4754-98f1-bccce93024e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914824 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:08 crc kubenswrapper[4912]: I0318 13:04:08.914867 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3c10b87-4662-4754-98f1-bccce93024e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:08 crc kubenswrapper[4912]: E0318 13:04:08.914970 4912 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:08 crc kubenswrapper[4912]: E0318 13:04:08.914997 4912 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:16.91499124 +0000 UTC m=+105.374418665 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:08 crc kubenswrapper[4912]: E0318 13:04:08.915374 4912 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:08 crc kubenswrapper[4912]: E0318 13:04:08.915451 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:16.915431661 +0000 UTC m=+105.374859076 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.015734 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3c10b87-4662-4754-98f1-bccce93024e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016091 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016111 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e3c10b87-4662-4754-98f1-bccce93024e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016138 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3c10b87-4662-4754-98f1-bccce93024e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: 
\"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016165 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e3c10b87-4662-4754-98f1-bccce93024e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016179 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3c10b87-4662-4754-98f1-bccce93024e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016197 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016319 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016334 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016344 4912 projected.go:194] Error preparing data for 
projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016388 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:17.016374969 +0000 UTC m=+105.475802394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016438 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016447 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016453 4912 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.016480 4912 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:17.016471032 +0000 UTC m=+105.475898457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016504 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e3c10b87-4662-4754-98f1-bccce93024e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.016758 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e3c10b87-4662-4754-98f1-bccce93024e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.017202 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3c10b87-4662-4754-98f1-bccce93024e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 
18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.025361 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3c10b87-4662-4754-98f1-bccce93024e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.035347 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3c10b87-4662-4754-98f1-bccce93024e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dl6vk\" (UID: \"e3c10b87-4662-4754-98f1-bccce93024e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.103464 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" Mar 18 13:04:09 crc kubenswrapper[4912]: W0318 13:04:09.118290 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c10b87_4662_4754_98f1_bccce93024e5.slice/crio-fca842ce899ec19a6215be84365baa5b7f022f418a8cb299b20902f797bf45a1 WatchSource:0}: Error finding container fca842ce899ec19a6215be84365baa5b7f022f418a8cb299b20902f797bf45a1: Status 404 returned error can't find the container with id fca842ce899ec19a6215be84365baa5b7f022f418a8cb299b20902f797bf45a1 Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.225138 4912 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.227361 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.227449 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.227471 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.227446 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.227575 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.227929 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q4ppq" podUID="1a9fc2ce-3a71-465b-823d-5b1af71d635c" Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.228022 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.228213 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.238914 4912 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.420985 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.421220 4912 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:09 crc kubenswrapper[4912]: E0318 13:04:09.421299 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs podName:1a9fc2ce-3a71-465b-823d-5b1af71d635c nodeName:}" failed. No retries permitted until 2026-03-18 13:04:13.421278242 +0000 UTC m=+101.880705667 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs") pod "network-metrics-daemon-q4ppq" (UID: "1a9fc2ce-3a71-465b-823d-5b1af71d635c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.688924 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" event={"ID":"e3c10b87-4662-4754-98f1-bccce93024e5","Type":"ContainerStarted","Data":"ed5e046a6a21c4fb07659e85c66d91ed12bf145ef4be784a5872614df8aa6ee6"} Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.688980 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" event={"ID":"e3c10b87-4662-4754-98f1-bccce93024e5","Type":"ContainerStarted","Data":"fca842ce899ec19a6215be84365baa5b7f022f418a8cb299b20902f797bf45a1"} Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.698030 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"f7ac30531ed71ae70a733ca813143eb11ce982a4f0afdd3339e105e499b95193"} Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.724185 4912 generic.go:334] "Generic (PLEG): container finished" podID="7799bd07-fe62-47e1-b738-e097930474f1" containerID="9f4706b6bc3b8e10cc6f4631de5fd3011a09bd5349b4a957df07281cbb955063" exitCode=0 Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.724279 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerDied","Data":"9f4706b6bc3b8e10cc6f4631de5fd3011a09bd5349b4a957df07281cbb955063"} Mar 18 13:04:09 crc kubenswrapper[4912]: I0318 13:04:09.759571 4912 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl6vk" podStartSLOduration=42.7595373 podStartE2EDuration="42.7595373s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:09.713425681 +0000 UTC m=+98.172853146" watchObservedRunningTime="2026-03-18 13:04:09.7595373 +0000 UTC m=+98.218964765" Mar 18 13:04:10 crc kubenswrapper[4912]: I0318 13:04:10.243000 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 13:04:10 crc kubenswrapper[4912]: I0318 13:04:10.731635 4912 generic.go:334] "Generic (PLEG): container finished" podID="7799bd07-fe62-47e1-b738-e097930474f1" containerID="fd54ee316233543c7d690272b051f7dc9eb412bb4c4dc9248929fd2a8c2df87b" exitCode=0 Mar 18 13:04:10 crc kubenswrapper[4912]: I0318 13:04:10.731719 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerDied","Data":"fd54ee316233543c7d690272b051f7dc9eb412bb4c4dc9248929fd2a8c2df87b"} Mar 18 13:04:10 crc kubenswrapper[4912]: I0318 13:04:10.765165 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.765142795 podStartE2EDuration="765.142795ms" podCreationTimestamp="2026-03-18 13:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:10.76370436 +0000 UTC m=+99.223131795" watchObservedRunningTime="2026-03-18 13:04:10.765142795 +0000 UTC m=+99.224570260" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.227035 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.227112 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.227193 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.227288 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:11 crc kubenswrapper[4912]: E0318 13:04:11.227324 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q4ppq" podUID="1a9fc2ce-3a71-465b-823d-5b1af71d635c" Mar 18 13:04:11 crc kubenswrapper[4912]: E0318 13:04:11.227437 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:11 crc kubenswrapper[4912]: E0318 13:04:11.227530 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:11 crc kubenswrapper[4912]: E0318 13:04:11.227594 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.745443 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerStarted","Data":"610141d1d9fc234ba7dc855038066b3f0904a0b83247a97f972c30f9f094df37"} Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.745894 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.745917 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.745930 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.752206 4912 generic.go:334] "Generic (PLEG): container finished" podID="7799bd07-fe62-47e1-b738-e097930474f1" containerID="427f8391c0b6d64d5b526e23db6bf6c676b9ad6609c7afa0a14287eab5f0bdfd" exitCode=0 Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.752280 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" 
event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerDied","Data":"427f8391c0b6d64d5b526e23db6bf6c676b9ad6609c7afa0a14287eab5f0bdfd"} Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.775508 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podStartSLOduration=44.775454003 podStartE2EDuration="44.775454003s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:11.771727874 +0000 UTC m=+100.231155319" watchObservedRunningTime="2026-03-18 13:04:11.775454003 +0000 UTC m=+100.234881428" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.784496 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:11 crc kubenswrapper[4912]: I0318 13:04:11.789005 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:12 crc kubenswrapper[4912]: I0318 13:04:12.763219 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" event={"ID":"7799bd07-fe62-47e1-b738-e097930474f1","Type":"ContainerStarted","Data":"2cf99ac83b5a27cfa2e44b6e0075f6e51b7eb942b5c7f24ebce4c81b9a9a6eea"} Mar 18 13:04:12 crc kubenswrapper[4912]: I0318 13:04:12.791746 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4m6dz" podStartSLOduration=45.791723364 podStartE2EDuration="45.791723364s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:12.791313234 +0000 UTC m=+101.250740699" watchObservedRunningTime="2026-03-18 13:04:12.791723364 +0000 UTC 
m=+101.251150789" Mar 18 13:04:13 crc kubenswrapper[4912]: I0318 13:04:13.227440 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:13 crc kubenswrapper[4912]: I0318 13:04:13.227594 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:13 crc kubenswrapper[4912]: I0318 13:04:13.227635 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:13 crc kubenswrapper[4912]: E0318 13:04:13.227664 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:13 crc kubenswrapper[4912]: E0318 13:04:13.227754 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q4ppq" podUID="1a9fc2ce-3a71-465b-823d-5b1af71d635c" Mar 18 13:04:13 crc kubenswrapper[4912]: I0318 13:04:13.227440 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:13 crc kubenswrapper[4912]: E0318 13:04:13.227879 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:13 crc kubenswrapper[4912]: E0318 13:04:13.228009 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:13 crc kubenswrapper[4912]: I0318 13:04:13.471178 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:13 crc kubenswrapper[4912]: E0318 13:04:13.471358 4912 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:13 crc kubenswrapper[4912]: E0318 13:04:13.471428 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs podName:1a9fc2ce-3a71-465b-823d-5b1af71d635c nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:21.471407397 +0000 UTC m=+109.930834822 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs") pod "network-metrics-daemon-q4ppq" (UID: "1a9fc2ce-3a71-465b-823d-5b1af71d635c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 13:04:13 crc kubenswrapper[4912]: I0318 13:04:13.934078 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q4ppq"] Mar 18 13:04:13 crc kubenswrapper[4912]: I0318 13:04:13.934289 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:13 crc kubenswrapper[4912]: E0318 13:04:13.934432 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q4ppq" podUID="1a9fc2ce-3a71-465b-823d-5b1af71d635c" Mar 18 13:04:15 crc kubenswrapper[4912]: I0318 13:04:15.227854 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:15 crc kubenswrapper[4912]: I0318 13:04:15.228025 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:15 crc kubenswrapper[4912]: E0318 13:04:15.228162 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 13:04:15 crc kubenswrapper[4912]: I0318 13:04:15.228084 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:15 crc kubenswrapper[4912]: E0318 13:04:15.228313 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 13:04:15 crc kubenswrapper[4912]: E0318 13:04:15.228391 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.227365 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:16 crc kubenswrapper[4912]: E0318 13:04:16.227562 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q4ppq" podUID="1a9fc2ce-3a71-465b-823d-5b1af71d635c" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.913415 4912 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.914128 4912 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.949627 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.949769 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.949804 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:16 crc kubenswrapper[4912]: E0318 13:04:16.949909 4912 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:16 crc kubenswrapper[4912]: E0318 13:04:16.949952 
4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:32.949939349 +0000 UTC m=+121.409366764 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 13:04:16 crc kubenswrapper[4912]: E0318 13:04:16.950312 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:32.950299588 +0000 UTC m=+121.409727013 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:16 crc kubenswrapper[4912]: E0318 13:04:16.950356 4912 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:16 crc kubenswrapper[4912]: E0318 13:04:16.950383 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:32.950376559 +0000 UTC m=+121.409803984 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.955016 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp"] Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.955709 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.955715 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"] Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.956613 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.967114 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.967162 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.967372 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.967376 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.967597 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.967703 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.968449 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.968467 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.968507 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.969304 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.970795 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.971127 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.975349 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2"] Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.975696 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.977850 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.977879 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.978225 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.978247 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.983230 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.984753 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr"] Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.985167 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.985493 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sxhvr"] Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.986018 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.986986 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.989313 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.989418 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.989429 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.989656 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.989770 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.989787 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.989807 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.991796 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.993147 4912 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.995187 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.998220 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7cw4z"] Mar 18 13:04:16 crc kubenswrapper[4912]: I0318 13:04:16.998782 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.001331 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.001540 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.001729 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sq25z"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.002414 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.007534 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.008262 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.010509 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pdkfk"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.011152 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nvlj8"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.011401 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.011932 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.012155 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.012597 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.013272 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.034011 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.035157 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vpn9z"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.038815 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tjwtn"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.039020 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.039437 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.039518 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.039606 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8fqp"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.039773 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040001 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040143 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040144 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040488 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040624 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040193 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040822 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040208 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040280 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.040395 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.042524 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.042540 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.042820 4912 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.043139 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.043389 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.045058 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.055092 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.055273 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.056764 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.057933 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.058346 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.058536 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.061134 4912 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.061825 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.062761 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069562 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8gc9\" (UniqueName: \"kubernetes.io/projected/5661de32-ecd9-4450-b757-465370105082-kube-api-access-m8gc9\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069628 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069666 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069687 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-config\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069701 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk946\" (UniqueName: \"kubernetes.io/projected/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-kube-api-access-nk946\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069724 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069752 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-images\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069770 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5661de32-ecd9-4450-b757-465370105082-serving-cert\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:17 crc kubenswrapper[4912]: E0318 13:04:17.069920 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:17 crc kubenswrapper[4912]: E0318 13:04:17.069935 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:17 crc kubenswrapper[4912]: E0318 13:04:17.069946 4912 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:17 crc kubenswrapper[4912]: E0318 13:04:17.069990 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:33.069976447 +0000 UTC m=+121.529403862 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.069914 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-config\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.070342 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-service-ca-bundle\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.070365 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:17 crc kubenswrapper[4912]: E0318 13:04:17.070433 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 13:04:17 crc 
kubenswrapper[4912]: E0318 13:04:17.070443 4912 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 13:04:17 crc kubenswrapper[4912]: E0318 13:04:17.070450 4912 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:17 crc kubenswrapper[4912]: E0318 13:04:17.070473 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:33.070465449 +0000 UTC m=+121.529892874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.072263 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jsbwx"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.072626 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-bbvtw"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.072847 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sn298"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 
13:04:17.077635 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.078323 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.078329 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.081266 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.081827 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.082121 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.082286 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.082419 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.082547 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.082669 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.082777 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.082885 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.083658 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.085996 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.086086 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.086489 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.086504 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.086591 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.086714 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.086914 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087030 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087075 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087177 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087209 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087271 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087331 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087384 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087415 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087479 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087523 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087582 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087620 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087280 
4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087786 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087853 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087957 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.087979 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.089801 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.089930 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.093753 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.096218 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.096374 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.096871 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 
13:04:17.097085 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.097141 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.097571 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.097863 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.098232 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.098532 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.098859 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.107801 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.108286 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.126157 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.135792 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.136372 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nv7qp"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.137731 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.138284 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.143198 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.147247 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.150879 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.154611 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gh7ht"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.154910 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.155022 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdfkl"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.154933 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.155438 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.155759 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.155932 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.156552 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.159407 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.161144 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.170095 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.171027 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.171841 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.174088 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.177632 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.178422 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.179891 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2ghlz"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.180340 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.180472 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.180627 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.180736 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.182143 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.185770 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.185963 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.186706 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.190749 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.191556 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.191774 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.193184 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.193744 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t75hw"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.193898 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.194500 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.195111 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.195472 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.195572 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.199292 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.201109 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sxhvr"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.202090 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7cw4z"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.203007 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ksmgs"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.203997 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.204187 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.205774 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nvlj8"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.208295 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sq25z"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.210952 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.212632 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.213546 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.216579 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pdkfk"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.218237 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.218533 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.220117 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vpn9z"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.220831 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8fqp"] Mar 18 13:04:17 crc 
kubenswrapper[4912]: I0318 13:04:17.221879 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nvzrd"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.222785 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.223515 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9b88n"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.228799 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.230306 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.230619 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.231420 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.232195 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.237901 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.242113 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.244095 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.245968 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2ghlz"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.247398 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.248289 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.249658 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.250757 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.252621 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sn298"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.253962 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.255575 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tjwtn"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.256773 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ksmgs"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.257746 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.258138 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.259242 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.260629 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdfkl"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.261899 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.263791 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gh7ht"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.264859 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jsbwx"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.266067 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nv7qp"] Mar 18 13:04:17 crc 
kubenswrapper[4912]: I0318 13:04:17.267291 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.270303 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t75hw"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.271913 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.273382 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.274197 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4r8tg"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.275832 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mncph"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.276239 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.276505 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mncph"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.276657 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.277514 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4r8tg"] Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.278932 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.298599 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.317990 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.338573 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.358587 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.377885 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.397846 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.418540 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.438619 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.458114 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.478368 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.498433 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.520625 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.538478 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.559872 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.578031 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.597663 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.618641 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.638859 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.658387 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.678280 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.698600 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.718429 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.738148 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.758227 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.778977 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.799790 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.818497 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.840703 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.860790 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.878133 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.900130 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.919086 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.958682 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 13:04:17 crc kubenswrapper[4912]: I0318 13:04:17.979493 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.000925 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.019649 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.039556 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.058923 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.077860 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.100564 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.119319 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.140374 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.159461 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.178129 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.196822 4912 request.go:700] Waited for 1.009392176s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/configmaps?fieldSelector=metadata.name%3Dtrusted-ca-bundle&limit=500&resourceVersion=0 Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.198951 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.219427 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.226985 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.239318 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.260085 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.279517 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.299337 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.322703 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.338961 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.358711 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.377804 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.399347 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.419263 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.439549 
4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.458854 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.479452 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.497803 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.518422 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.538095 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.558900 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.579262 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.599696 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.619555 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.639107 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.659662 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.678979 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.699125 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.720108 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.750149 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.759482 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.779489 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.799524 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.819436 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.838561 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.860101 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-sysctl-allowlist" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.879661 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.900094 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.919564 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.939231 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.959576 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 13:04:18 crc kubenswrapper[4912]: I0318 13:04:18.979649 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:18.999938 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.018793 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.039590 4912 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.059282 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 13:04:19 crc kubenswrapper[4912]: 
I0318 13:04:19.080217 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.099889 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.119598 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.139310 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.179150 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 13:04:19 crc kubenswrapper[4912]: I0318 13:04:19.199188 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 13:04:22 crc kubenswrapper[4912]: I0318 13:04:22.893652 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:04:29 crc kubenswrapper[4912]: I0318 13:04:29.246519 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 13:04:35 crc kubenswrapper[4912]: I0318 13:04:35.562022 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:04:35 crc kubenswrapper[4912]: I0318 13:04:35.590984 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=6.590950263 podStartE2EDuration="6.590950263s" podCreationTimestamp="2026-03-18 13:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:32.258974217 +0000 UTC m=+120.718401682" watchObservedRunningTime="2026-03-18 13:04:35.590950263 +0000 UTC m=+124.050377728" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.250865 4912 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.283837 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.284708 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-config\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.284979 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:05:09.284757125 +0000 UTC m=+157.744184540 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.285108 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-service-ca-bundle\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.285299 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e116845-2d89-48e3-b832-584be4553fd3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.285507 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pwhl\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-kube-api-access-9pwhl\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.285595 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.285758 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.285875 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-bound-sa-token\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.285965 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e116845-2d89-48e3-b832-584be4553fd3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286015 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-images\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286137 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286209 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286303 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-registry-certificates\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286383 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286497 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286566 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8gc9\" (UniqueName: \"kubernetes.io/projected/5661de32-ecd9-4450-b757-465370105082-kube-api-access-m8gc9\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.286579 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:37.786563479 +0000 UTC m=+126.245991154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286696 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286753 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-trusted-ca\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286814 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-config\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286857 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk946\" (UniqueName: \"kubernetes.io/projected/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-kube-api-access-nk946\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286893 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-registry-tls\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286937 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.286974 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5661de32-ecd9-4450-b757-465370105082-serving-cert\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.287179 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.290792 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.291083 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.291244 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.291389 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.291883 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.291939 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.292179 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 13:04:37 crc 
kubenswrapper[4912]: I0318 13:04:37.292181 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.292213 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.296116 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-config\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.297073 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-images\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.297132 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-service-ca-bundle\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.297272 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.297873 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-config\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.301272 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.302328 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.303157 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.304423 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.305428 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5661de32-ecd9-4450-b757-465370105082-serving-cert\") pod 
\"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.305431 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9fc2ce-3a71-465b-823d-5b1af71d635c-metrics-certs\") pod \"network-metrics-daemon-q4ppq\" (UID: \"1a9fc2ce-3a71-465b-823d-5b1af71d635c\") " pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.306306 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.306666 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5661de32-ecd9-4450-b757-465370105082-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.308159 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.311518 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.313214 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.317023 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.318257 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.330621 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8gc9\" (UniqueName: \"kubernetes.io/projected/5661de32-ecd9-4450-b757-465370105082-kube-api-access-m8gc9\") pod \"authentication-operator-69f744f599-sxhvr\" (UID: \"5661de32-ecd9-4450-b757-465370105082\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.330694 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk946\" (UniqueName: \"kubernetes.io/projected/ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e-kube-api-access-nk946\") pod \"machine-api-operator-5694c8668f-7cw4z\" (UID: \"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.388212 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.388367 4912 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:37.888332997 +0000 UTC m=+126.347760422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.389469 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e116845-2d89-48e3-b832-584be4553fd3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.389740 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvnkn\" (UniqueName: \"kubernetes.io/projected/1c4b9717-5ad2-4993-b69c-ad3266d07766-kube-api-access-hvnkn\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.389863 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.389944 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e116845-2d89-48e3-b832-584be4553fd3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.389956 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-audit\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390078 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c330cee-6841-4810-8701-53c782ee170b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-f7ql4\" (UID: \"7c330cee-6841-4810-8701-53c782ee170b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390098 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-oauth-config\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390114 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390163 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-image-import-ca\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390177 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qr4s\" (UniqueName: \"kubernetes.io/projected/65ef2d7f-a45a-4787-a4e6-441dee567ed0-kube-api-access-4qr4s\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390223 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-audit-policies\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390242 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc 
kubenswrapper[4912]: I0318 13:04:37.390257 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2853e2e7-ddce-4741-bb9e-0364ffef8e30-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390299 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49b7b174-b61f-4835-abf2-f90c11167250-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390318 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b174-b61f-4835-abf2-f90c11167250-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390333 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-csi-data-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390373 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-metrics-tls\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390388 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqgnx\" (UniqueName: \"kubernetes.io/projected/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-kube-api-access-pqgnx\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390427 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5037377a-5754-40b3-8ffc-ef8776d54442-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390446 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-registry-certificates\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.390462 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2853e2e7-ddce-4741-bb9e-0364ffef8e30-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 
13:04:37.391189 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqrgj\" (UniqueName: \"kubernetes.io/projected/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-kube-api-access-xqrgj\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391278 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-serving-cert\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391352 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7feb8268-723e-408b-b800-744481779d38-service-ca-bundle\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391401 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-registry-certificates\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391430 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/af95cab8-055a-4971-acfb-75a0dfe1e394-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nv7qp\" (UID: 
\"af95cab8-055a-4971-acfb-75a0dfe1e394\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391470 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-apiservice-cert\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391485 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvcj\" (UniqueName: \"kubernetes.io/projected/f9b36bfe-8b4f-4dc7-81df-783180e97a44-kube-api-access-hqvcj\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391501 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-etcd-serving-ca\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391524 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a729c6f8-e561-4f43-8fb9-48834e1873f2-serving-cert\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391538 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2524b\" (UniqueName: \"kubernetes.io/projected/a729c6f8-e561-4f43-8fb9-48834e1873f2-kube-api-access-2524b\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391553 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391569 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391585 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d456817a-6755-41c0-bf82-bbb3bf4c35fa-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391600 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-service-ca\") pod 
\"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391613 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff22e507-73a7-44b1-9eab-c704fb998092-node-pullsecrets\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391629 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-plugins-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391643 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391659 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-tmpfs\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391754 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-serving-cert\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391875 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f844c0-ca4c-4097-bedd-bbb4323cc717-secret-volume\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.391900 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5rb6\" (UniqueName: \"kubernetes.io/projected/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-kube-api-access-n5rb6\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392000 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-registry-tls\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392409 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392480 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsbw2\" (UniqueName: \"kubernetes.io/projected/5f2f03ae-9287-4840-bcda-91d0b68849d7-kube-api-access-lsbw2\") pod \"package-server-manager-789f6589d5-mjtnc\" (UID: \"5f2f03ae-9287-4840-bcda-91d0b68849d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392512 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcc5e62-992c-4554-8296-721247078f5b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392566 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-config-volume\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392595 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff22e507-73a7-44b1-9eab-c704fb998092-audit-dir\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392672 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/f9b36bfe-8b4f-4dc7-81df-783180e97a44-proxy-tls\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392704 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-encryption-config\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392723 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-trusted-ca-bundle\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392741 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5scdh\" (UniqueName: \"kubernetes.io/projected/5037377a-5754-40b3-8ffc-ef8776d54442-kube-api-access-5scdh\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392769 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392789 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlgz9\" (UniqueName: \"kubernetes.io/projected/2cb39839-5023-4811-8fc9-0432601dc0d8-kube-api-access-zlgz9\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392914 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-service-ca\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.392971 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-webhook-cert\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393000 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b210beca-0aed-404b-9af5-b704345ce2f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393080 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-trusted-ca\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393198 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-signing-cabundle\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393291 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e116845-2d89-48e3-b832-584be4553fd3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393387 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-metrics-certs\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393475 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65b97390-afd1-41da-9b38-f3467a213007-audit-dir\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393512 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-proxy-tls\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pwhl\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-kube-api-access-9pwhl\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393608 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmx8x\" (UniqueName: \"kubernetes.io/projected/542111b4-ff3a-41fc-a963-0b55c1ace3e9-kube-api-access-tmx8x\") pod \"migrator-59844c95c7-9vx6l\" (UID: \"542111b4-ff3a-41fc-a963-0b55c1ace3e9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393891 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393908 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-client-ca\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 
crc kubenswrapper[4912]: I0318 13:04:37.393956 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393978 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28kt\" (UniqueName: \"kubernetes.io/projected/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-kube-api-access-m28kt\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.393995 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7qvb\" (UniqueName: \"kubernetes.io/projected/b210beca-0aed-404b-9af5-b704345ce2f8-kube-api-access-v7qvb\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394030 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394068 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svh96\" 
(UniqueName: \"kubernetes.io/projected/1fc33f6a-4371-412a-9e69-52c81be07685-kube-api-access-svh96\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394086 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394107 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-serving-cert\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394122 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-config\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394136 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-ca\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc 
kubenswrapper[4912]: I0318 13:04:37.394151 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394165 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjf9\" (UniqueName: \"kubernetes.io/projected/ff22e507-73a7-44b1-9eab-c704fb998092-kube-api-access-zdjf9\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394200 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2853e2e7-ddce-4741-bb9e-0364ffef8e30-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394223 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f844c0-ca4c-4097-bedd-bbb4323cc717-config-volume\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394244 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/b2ce32fd-55a5-494b-be29-50eb0382b515-node-bootstrap-token\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394271 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394291 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-serving-cert\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394308 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394324 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-mountpoint-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394343 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8vs\" (UniqueName: \"kubernetes.io/projected/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-kube-api-access-6g8vs\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394360 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-registration-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394376 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a729c6f8-e561-4f43-8fb9-48834e1873f2-config\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394387 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394391 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-cert\") pod \"ingress-canary-mncph\" (UID: \"7b2647fd-d28d-4dff-a4ef-e7839fffd33e\") " pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394419 
4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7t5k\" (UniqueName: \"kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394436 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-audit-policies\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394593 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-images\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394617 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394635 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394660 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b210beca-0aed-404b-9af5-b704345ce2f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394700 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394719 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/de010a28-87e3-4340-87fc-9242ad95647a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb9sv\" (UID: \"de010a28-87e3-4340-87fc-9242ad95647a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394762 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1c4b9717-5ad2-4993-b69c-ad3266d07766-machine-approver-tls\") pod \"machine-approver-56656f9798-zshjp\" (UID: 
\"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394856 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1395fca-0450-4427-98ab-c41857892b0a-trusted-ca\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394892 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htjvs\" (UniqueName: \"kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394917 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394939 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcc5e62-992c-4554-8296-721247078f5b-config\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.394959 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbw49\" (UniqueName: \"kubernetes.io/projected/1b34ff88-74eb-45ce-acd4-3b7b272e1747-kube-api-access-wbw49\") pod \"downloads-7954f5f757-2ghlz\" (UID: \"1b34ff88-74eb-45ce-acd4-3b7b272e1747\") " pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.394995 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:37.894982927 +0000 UTC m=+126.354410352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395012 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d456817a-6755-41c0-bf82-bbb3bf4c35fa-ready\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395072 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65ef2d7f-a45a-4787-a4e6-441dee567ed0-serving-cert\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 
13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395107 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2sk\" (UniqueName: \"kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395141 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-stats-auth\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395174 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b2ce32fd-55a5-494b-be29-50eb0382b515-certs\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395210 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-serving-cert\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395250 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-oauth-serving-cert\") pod \"console-f9d7485db-vpn9z\" (UID: 
\"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395274 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-socket-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395298 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dcc5e62-992c-4554-8296-721247078f5b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395319 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-config\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395341 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmxmj\" (UniqueName: \"kubernetes.io/projected/2853e2e7-ddce-4741-bb9e-0364ffef8e30-kube-api-access-mmxmj\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395363 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d456817a-6755-41c0-bf82-bbb3bf4c35fa-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395429 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl75w\" (UniqueName: \"kubernetes.io/projected/d456817a-6755-41c0-bf82-bbb3bf4c35fa-kube-api-access-nl75w\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395466 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n6bd\" (UniqueName: \"kubernetes.io/projected/de010a28-87e3-4340-87fc-9242ad95647a-kube-api-access-8n6bd\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb9sv\" (UID: \"de010a28-87e3-4340-87fc-9242ad95647a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395511 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-trusted-ca\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395537 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395558 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395578 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e1395fca-0450-4427-98ab-c41857892b0a-metrics-tls\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395593 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99xbg\" (UniqueName: \"kubernetes.io/projected/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-kube-api-access-99xbg\") pod \"ingress-canary-mncph\" (UID: \"7b2647fd-d28d-4dff-a4ef-e7839fffd33e\") " pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395609 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f2f03ae-9287-4840-bcda-91d0b68849d7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mjtnc\" (UID: \"5f2f03ae-9287-4840-bcda-91d0b68849d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395625 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-default-certificate\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395641 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4b9717-5ad2-4993-b69c-ad3266d07766-config\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.395664 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/718af076-f027-4594-8294-53ec36b84f3c-srv-cert\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397013 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c71dac9-3812-4931-a1ef-0f0796ed93c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397711 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84jrv\" (UniqueName: \"kubernetes.io/projected/7feb8268-723e-408b-b800-744481779d38-kube-api-access-84jrv\") pod \"router-default-5444994796-bbvtw\" 
(UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397847 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c71dac9-3812-4931-a1ef-0f0796ed93c9-config\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397876 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5037377a-5754-40b3-8ffc-ef8776d54442-srv-cert\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397895 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vm6\" (UniqueName: \"kubernetes.io/projected/af95cab8-055a-4971-acfb-75a0dfe1e394-kube-api-access-k7vm6\") pod \"multus-admission-controller-857f4d67dd-nv7qp\" (UID: \"af95cab8-055a-4971-acfb-75a0dfe1e394\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397911 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-client\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397927 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397944 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96x7l\" (UniqueName: \"kubernetes.io/projected/49b7b174-b61f-4835-abf2-f90c11167250-kube-api-access-96x7l\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397959 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397976 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-client-ca\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.397995 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2zvk\" (UniqueName: 
\"kubernetes.io/projected/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-kube-api-access-l2zvk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398012 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphh7\" (UniqueName: \"kubernetes.io/projected/718af076-f027-4594-8294-53ec36b84f3c-kube-api-access-wphh7\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398029 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-signing-key\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398060 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5ktm\" (UniqueName: \"kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398076 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dd6\" (UniqueName: \"kubernetes.io/projected/39a2121c-5ff0-4ff6-84de-f1354552a568-kube-api-access-z2dd6\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398105 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6cs\" (UniqueName: \"kubernetes.io/projected/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-kube-api-access-vp6cs\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398134 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc33f6a-4371-412a-9e69-52c81be07685-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398152 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398169 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c489444-b3c7-4ec6-a959-6dcdb2b83660-metrics-tls\") pod \"dns-operator-744455d44c-nvlj8\" (UID: \"6c489444-b3c7-4ec6-a959-6dcdb2b83660\") " pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398195 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-config\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398219 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-console-config\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398248 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vshj7\" (UniqueName: \"kubernetes.io/projected/b2ce32fd-55a5-494b-be29-50eb0382b515-kube-api-access-vshj7\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398284 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-encryption-config\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398307 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a2121c-5ff0-4ff6-84de-f1354552a568-audit-dir\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc 
kubenswrapper[4912]: I0318 13:04:37.398329 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r7zq\" (UniqueName: \"kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-kube-api-access-7r7zq\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398351 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/718af076-f027-4594-8294-53ec36b84f3c-profile-collector-cert\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398373 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c71dac9-3812-4931-a1ef-0f0796ed93c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398396 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398418 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-config\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398441 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-etcd-client\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398466 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jzrh\" (UniqueName: \"kubernetes.io/projected/6c489444-b3c7-4ec6-a959-6dcdb2b83660-kube-api-access-7jzrh\") pod \"dns-operator-744455d44c-nvlj8\" (UID: \"6c489444-b3c7-4ec6-a959-6dcdb2b83660\") " pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398492 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-bound-sa-token\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398517 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb39839-5023-4811-8fc9-0432601dc0d8-serving-cert\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 
13:04:37.398556 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9b36bfe-8b4f-4dc7-81df-783180e97a44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398583 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398606 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-etcd-client\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398629 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-config\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398653 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fc33f6a-4371-412a-9e69-52c81be07685-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398676 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c4b9717-5ad2-4993-b69c-ad3266d07766-auth-proxy-config\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398702 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gh55\" (UniqueName: \"kubernetes.io/projected/7c330cee-6841-4810-8701-53c782ee170b-kube-api-access-2gh55\") pod \"cluster-samples-operator-665b6dd947-f7ql4\" (UID: \"7c330cee-6841-4810-8701-53c782ee170b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.398726 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5w6\" (UniqueName: \"kubernetes.io/projected/3abcfc85-e792-4ba8-a6c2-db7130b1f423-kube-api-access-mg5w6\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.404364 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.407439 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/1e116845-2d89-48e3-b832-584be4553fd3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.407476 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-registry-tls\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.408468 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-trusted-ca\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.409788 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pwhl\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-kube-api-access-9pwhl\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.418858 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-bound-sa-token\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.450629 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.459052 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q4ppq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.492914 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.499882 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500315 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500648 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/af95cab8-055a-4971-acfb-75a0dfe1e394-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nv7qp\" (UID: \"af95cab8-055a-4971-acfb-75a0dfe1e394\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500693 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-apiservice-cert\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500731 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvcj\" (UniqueName: \"kubernetes.io/projected/f9b36bfe-8b4f-4dc7-81df-783180e97a44-kube-api-access-hqvcj\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500769 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-etcd-serving-ca\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500835 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a729c6f8-e561-4f43-8fb9-48834e1873f2-serving-cert\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500869 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2524b\" (UniqueName: \"kubernetes.io/projected/a729c6f8-e561-4f43-8fb9-48834e1873f2-kube-api-access-2524b\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500905 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: 
\"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500939 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.500970 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d456817a-6755-41c0-bf82-bbb3bf4c35fa-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501004 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-service-ca\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501066 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff22e507-73a7-44b1-9eab-c704fb998092-node-pullsecrets\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501101 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-plugins-dir\") pod 
\"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501166 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501208 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-tmpfs\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501241 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-serving-cert\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501271 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f844c0-ca4c-4097-bedd-bbb4323cc717-secret-volume\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501304 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rb6\" (UniqueName: 
\"kubernetes.io/projected/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-kube-api-access-n5rb6\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501345 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501386 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsbw2\" (UniqueName: \"kubernetes.io/projected/5f2f03ae-9287-4840-bcda-91d0b68849d7-kube-api-access-lsbw2\") pod \"package-server-manager-789f6589d5-mjtnc\" (UID: \"5f2f03ae-9287-4840-bcda-91d0b68849d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501420 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcc5e62-992c-4554-8296-721247078f5b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501454 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-config-volume\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:37 crc 
kubenswrapper[4912]: I0318 13:04:37.501486 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff22e507-73a7-44b1-9eab-c704fb998092-audit-dir\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501517 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9b36bfe-8b4f-4dc7-81df-783180e97a44-proxy-tls\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501550 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-encryption-config\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501578 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-trusted-ca-bundle\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501609 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5scdh\" (UniqueName: \"kubernetes.io/projected/5037377a-5754-40b3-8ffc-ef8776d54442-kube-api-access-5scdh\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501642 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501680 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlgz9\" (UniqueName: \"kubernetes.io/projected/2cb39839-5023-4811-8fc9-0432601dc0d8-kube-api-access-zlgz9\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501725 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-service-ca\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501764 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-webhook-cert\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501798 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/b210beca-0aed-404b-9af5-b704345ce2f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501833 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-trusted-ca\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501870 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-signing-cabundle\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.501918 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.00189107 +0000 UTC m=+126.461318485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.501975 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-metrics-certs\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502016 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65b97390-afd1-41da-9b38-f3467a213007-audit-dir\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502056 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-proxy-tls\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502080 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmx8x\" (UniqueName: \"kubernetes.io/projected/542111b4-ff3a-41fc-a963-0b55c1ace3e9-kube-api-access-tmx8x\") pod \"migrator-59844c95c7-9vx6l\" (UID: 
\"542111b4-ff3a-41fc-a963-0b55c1ace3e9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502105 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-client-ca\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502129 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502147 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m28kt\" (UniqueName: \"kubernetes.io/projected/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-kube-api-access-m28kt\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502169 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7qvb\" (UniqueName: \"kubernetes.io/projected/b210beca-0aed-404b-9af5-b704345ce2f8-kube-api-access-v7qvb\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502189 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502209 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svh96\" (UniqueName: \"kubernetes.io/projected/1fc33f6a-4371-412a-9e69-52c81be07685-kube-api-access-svh96\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502226 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502243 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-serving-cert\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502260 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-config\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: 
I0318 13:04:37.502277 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-ca\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502295 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502313 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjf9\" (UniqueName: \"kubernetes.io/projected/ff22e507-73a7-44b1-9eab-c704fb998092-kube-api-access-zdjf9\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502337 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2853e2e7-ddce-4741-bb9e-0364ffef8e30-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502356 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f844c0-ca4c-4097-bedd-bbb4323cc717-config-volume\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502372 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b2ce32fd-55a5-494b-be29-50eb0382b515-node-bootstrap-token\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502370 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff22e507-73a7-44b1-9eab-c704fb998092-node-pullsecrets\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502396 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502416 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-serving-cert\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502456 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502477 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-mountpoint-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502497 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8vs\" (UniqueName: \"kubernetes.io/projected/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-kube-api-access-6g8vs\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502516 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-registration-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502533 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a729c6f8-e561-4f43-8fb9-48834e1873f2-config\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502548 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-cert\") pod \"ingress-canary-mncph\" (UID: \"7b2647fd-d28d-4dff-a4ef-e7839fffd33e\") " pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502568 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7t5k\" (UniqueName: \"kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-audit-policies\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502607 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-images\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502624 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502641 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502659 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b210beca-0aed-404b-9af5-b704345ce2f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502694 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502723 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/de010a28-87e3-4340-87fc-9242ad95647a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb9sv\" (UID: \"de010a28-87e3-4340-87fc-9242ad95647a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502722 4912 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-plugins-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502748 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1c4b9717-5ad2-4993-b69c-ad3266d07766-machine-approver-tls\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502768 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1395fca-0450-4427-98ab-c41857892b0a-trusted-ca\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502786 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htjvs\" (UniqueName: \"kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502802 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 
crc kubenswrapper[4912]: I0318 13:04:37.502818 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcc5e62-992c-4554-8296-721247078f5b-config\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502837 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbw49\" (UniqueName: \"kubernetes.io/projected/1b34ff88-74eb-45ce-acd4-3b7b272e1747-kube-api-access-wbw49\") pod \"downloads-7954f5f757-2ghlz\" (UID: \"1b34ff88-74eb-45ce-acd4-3b7b272e1747\") " pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502853 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d456817a-6755-41c0-bf82-bbb3bf4c35fa-ready\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502869 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65ef2d7f-a45a-4787-a4e6-441dee567ed0-serving-cert\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502889 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2sk\" (UniqueName: \"kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502906 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-stats-auth\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502922 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b2ce32fd-55a5-494b-be29-50eb0382b515-certs\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502938 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-serving-cert\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502944 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff22e507-73a7-44b1-9eab-c704fb998092-audit-dir\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502957 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-oauth-serving-cert\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " 
pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502975 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-socket-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502993 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dcc5e62-992c-4554-8296-721247078f5b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503012 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-config\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503050 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmxmj\" (UniqueName: \"kubernetes.io/projected/2853e2e7-ddce-4741-bb9e-0364ffef8e30-kube-api-access-mmxmj\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503069 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d456817a-6755-41c0-bf82-bbb3bf4c35fa-tuning-conf-dir\") pod 
\"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503084 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl75w\" (UniqueName: \"kubernetes.io/projected/d456817a-6755-41c0-bf82-bbb3bf4c35fa-kube-api-access-nl75w\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503099 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n6bd\" (UniqueName: \"kubernetes.io/projected/de010a28-87e3-4340-87fc-9242ad95647a-kube-api-access-8n6bd\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb9sv\" (UID: \"de010a28-87e3-4340-87fc-9242ad95647a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503119 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503135 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503156 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e1395fca-0450-4427-98ab-c41857892b0a-metrics-tls\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503172 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99xbg\" (UniqueName: \"kubernetes.io/projected/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-kube-api-access-99xbg\") pod \"ingress-canary-mncph\" (UID: \"7b2647fd-d28d-4dff-a4ef-e7839fffd33e\") " pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503193 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f2f03ae-9287-4840-bcda-91d0b68849d7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mjtnc\" (UID: \"5f2f03ae-9287-4840-bcda-91d0b68849d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503202 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-tmpfs\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503212 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-default-certificate\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " 
pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503255 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4b9717-5ad2-4993-b69c-ad3266d07766-config\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503277 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/718af076-f027-4594-8294-53ec36b84f3c-srv-cert\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503300 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c71dac9-3812-4931-a1ef-0f0796ed93c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503320 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jrv\" (UniqueName: \"kubernetes.io/projected/7feb8268-723e-408b-b800-744481779d38-kube-api-access-84jrv\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503344 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c71dac9-3812-4931-a1ef-0f0796ed93c9-config\") pod 
\"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503362 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5037377a-5754-40b3-8ffc-ef8776d54442-srv-cert\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503385 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vm6\" (UniqueName: \"kubernetes.io/projected/af95cab8-055a-4971-acfb-75a0dfe1e394-kube-api-access-k7vm6\") pod \"multus-admission-controller-857f4d67dd-nv7qp\" (UID: \"af95cab8-055a-4971-acfb-75a0dfe1e394\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503405 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-client\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503415 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65b97390-afd1-41da-9b38-f3467a213007-audit-dir\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503423 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.503954 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b210beca-0aed-404b-9af5-b704345ce2f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.502902 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504238 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d456817a-6755-41c0-bf82-bbb3bf4c35fa-ready\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504765 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96x7l\" (UniqueName: \"kubernetes.io/projected/49b7b174-b61f-4835-abf2-f90c11167250-kube-api-access-96x7l\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 
13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504791 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504850 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-client-ca\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504870 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2zvk\" (UniqueName: \"kubernetes.io/projected/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-kube-api-access-l2zvk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504889 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphh7\" (UniqueName: \"kubernetes.io/projected/718af076-f027-4594-8294-53ec36b84f3c-kube-api-access-wphh7\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504912 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-signing-key\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504930 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ktm\" (UniqueName: \"kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504951 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dd6\" (UniqueName: \"kubernetes.io/projected/39a2121c-5ff0-4ff6-84de-f1354552a568-kube-api-access-z2dd6\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504975 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6cs\" (UniqueName: \"kubernetes.io/projected/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-kube-api-access-vp6cs\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.504997 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc33f6a-4371-412a-9e69-52c81be07685-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505019 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505065 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c489444-b3c7-4ec6-a959-6dcdb2b83660-metrics-tls\") pod \"dns-operator-744455d44c-nvlj8\" (UID: \"6c489444-b3c7-4ec6-a959-6dcdb2b83660\") " pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505092 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-config\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505110 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-console-config\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505129 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vshj7\" (UniqueName: \"kubernetes.io/projected/b2ce32fd-55a5-494b-be29-50eb0382b515-kube-api-access-vshj7\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc 
kubenswrapper[4912]: I0318 13:04:37.505154 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-encryption-config\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505176 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a2121c-5ff0-4ff6-84de-f1354552a568-audit-dir\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505197 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r7zq\" (UniqueName: \"kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-kube-api-access-7r7zq\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505217 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/718af076-f027-4594-8294-53ec36b84f3c-profile-collector-cert\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505238 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c71dac9-3812-4931-a1ef-0f0796ed93c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: 
\"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505255 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505272 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-config\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505290 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-etcd-client\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505314 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jzrh\" (UniqueName: \"kubernetes.io/projected/6c489444-b3c7-4ec6-a959-6dcdb2b83660-kube-api-access-7jzrh\") pod \"dns-operator-744455d44c-nvlj8\" (UID: \"6c489444-b3c7-4ec6-a959-6dcdb2b83660\") " pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505335 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2cb39839-5023-4811-8fc9-0432601dc0d8-serving-cert\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.505593 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-socket-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.506184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a2121c-5ff0-4ff6-84de-f1354552a568-audit-dir\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.507460 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d456817a-6755-41c0-bf82-bbb3bf4c35fa-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.507674 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-mountpoint-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.508427 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-registration-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.509713 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.009692837 +0000 UTC m=+126.469120272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.512797 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2853e2e7-ddce-4741-bb9e-0364ffef8e30-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521127 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9b36bfe-8b4f-4dc7-81df-783180e97a44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521218 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521248 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-etcd-client\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521286 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-config\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521317 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fc33f6a-4371-412a-9e69-52c81be07685-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521349 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c4b9717-5ad2-4993-b69c-ad3266d07766-auth-proxy-config\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521380 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gh55\" (UniqueName: \"kubernetes.io/projected/7c330cee-6841-4810-8701-53c782ee170b-kube-api-access-2gh55\") pod \"cluster-samples-operator-665b6dd947-f7ql4\" (UID: \"7c330cee-6841-4810-8701-53c782ee170b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521408 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5w6\" (UniqueName: \"kubernetes.io/projected/3abcfc85-e792-4ba8-a6c2-db7130b1f423-kube-api-access-mg5w6\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521438 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvnkn\" (UniqueName: \"kubernetes.io/projected/1c4b9717-5ad2-4993-b69c-ad3266d07766-kube-api-access-hvnkn\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521462 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521491 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-audit\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521517 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c330cee-6841-4810-8701-53c782ee170b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-f7ql4\" (UID: \"7c330cee-6841-4810-8701-53c782ee170b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521535 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-oauth-config\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521555 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521571 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-image-import-ca\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521589 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4qr4s\" (UniqueName: \"kubernetes.io/projected/65ef2d7f-a45a-4787-a4e6-441dee567ed0-kube-api-access-4qr4s\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521606 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-audit-policies\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521622 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521641 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2853e2e7-ddce-4741-bb9e-0364ffef8e30-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521665 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49b7b174-b61f-4835-abf2-f90c11167250-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521681 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b174-b61f-4835-abf2-f90c11167250-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521699 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-csi-data-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521719 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-metrics-tls\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521735 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqgnx\" (UniqueName: \"kubernetes.io/projected/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-kube-api-access-pqgnx\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521754 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5037377a-5754-40b3-8ffc-ef8776d54442-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521776 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2853e2e7-ddce-4741-bb9e-0364ffef8e30-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521794 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqrgj\" (UniqueName: \"kubernetes.io/projected/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-kube-api-access-xqrgj\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521813 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-serving-cert\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.521831 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7feb8268-723e-408b-b800-744481779d38-service-ca-bundle\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.522491 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/3abcfc85-e792-4ba8-a6c2-db7130b1f423-csi-data-dir\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.522723 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9b36bfe-8b4f-4dc7-81df-783180e97a44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.560731 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.561673 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.563984 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.591580 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.592313 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.592354 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.592462 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.592682 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.592728 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.592853 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593001 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593109 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593157 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593242 4912 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593289 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593305 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593357 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593398 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593477 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593566 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593596 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593661 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593684 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593774 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 
13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593823 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593891 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593950 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593959 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594214 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594262 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594346 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594431 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594491 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594528 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594631 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594742 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.594869 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.595025 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.595079 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.595216 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593774 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.593286 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.598015 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-service-ca\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.598662 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-config-volume\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.599370 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-etcd-serving-ca\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.599414 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d456817a-6755-41c0-bf82-bbb3bf4c35fa-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.599751 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-console-config\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.600126 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.600187 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.600561 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a729c6f8-e561-4f43-8fb9-48834e1873f2-serving-cert\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.601662 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vshj7\" (UniqueName: \"kubernetes.io/projected/b2ce32fd-55a5-494b-be29-50eb0382b515-kube-api-access-vshj7\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.602622 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-config\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.603333 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc33f6a-4371-412a-9e69-52c81be07685-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.603475 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-client-ca\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.603993 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-apiservice-cert\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.606628 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.608204 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-signing-key\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.608800 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-config\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.608961 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.609484 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-service-ca\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.614197 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.614523 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.614702 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.614908 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.615143 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.615292 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.616930 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-trusted-ca-bundle\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.617518 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-client-ca\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.618495 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.618641 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.620301 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcc5e62-992c-4554-8296-721247078f5b-config\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.621111 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4b9717-5ad2-4993-b69c-ad3266d07766-config\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.622116 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-config\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.622317 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.622414 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-oauth-serving-cert\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.622550 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a729c6f8-e561-4f43-8fb9-48834e1873f2-config\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.622855 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.623590 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-audit-policies\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.624226 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.625746 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.626899 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.627289 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.627857 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.628106 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.628698 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-signing-cabundle\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.628762 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vm6\" (UniqueName: \"kubernetes.io/projected/af95cab8-055a-4971-acfb-75a0dfe1e394-kube-api-access-k7vm6\") pod \"multus-admission-controller-857f4d67dd-nv7qp\" (UID: \"af95cab8-055a-4971-acfb-75a0dfe1e394\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.629195 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.629512 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.629634 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c489444-b3c7-4ec6-a959-6dcdb2b83660-metrics-tls\") pod \"dns-operator-744455d44c-nvlj8\" (UID: \"6c489444-b3c7-4ec6-a959-6dcdb2b83660\") " pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.629939 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n6bd\" (UniqueName: \"kubernetes.io/projected/de010a28-87e3-4340-87fc-9242ad95647a-kube-api-access-8n6bd\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb9sv\" (UID: \"de010a28-87e3-4340-87fc-9242ad95647a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.629946 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.630276 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.130250878 +0000 UTC m=+126.589678313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.630534 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-serving-cert\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.630648 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-encryption-config\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.630678 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.631086 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl75w\" (UniqueName: \"kubernetes.io/projected/d456817a-6755-41c0-bf82-bbb3bf4c35fa-kube-api-access-nl75w\") pod \"cni-sysctl-allowlist-ds-nvzrd\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.631584 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-default-certificate\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.631585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-proxy-tls\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.631787 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.631976 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.632709 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-serving-cert\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.632982 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-images\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.633187 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9b36bfe-8b4f-4dc7-81df-783180e97a44-proxy-tls\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.633585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-encryption-config\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.633869 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c71dac9-3812-4931-a1ef-0f0796ed93c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.633902 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.634236 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-serving-cert\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn"
Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.634521 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.13448388 +0000 UTC m=+126.593911305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.636771 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.637454 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.638422 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.641092 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.642391 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-webhook-cert\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.642691 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.643183 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvcj\" (UniqueName: \"kubernetes.io/projected/f9b36bfe-8b4f-4dc7-81df-783180e97a44-kube-api-access-hqvcj\") pod \"machine-config-controller-84d6567774-d7s7p\" (UID: \"f9b36bfe-8b4f-4dc7-81df-783180e97a44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.643618 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.644517 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-client\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.645069 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65ef2d7f-a45a-4787-a4e6-441dee567ed0-serving-cert\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.645132 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-trusted-ca\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.645330 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f844c0-ca4c-4097-bedd-bbb4323cc717-secret-volume\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.645906 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/718af076-f027-4594-8294-53ec36b84f3c-srv-cert\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.647275 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-etcd-client\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.647457 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.647686 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.648086 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.648344 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.648360 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8vs\" (UniqueName: \"kubernetes.io/projected/a8c0fa1a-2ab4-42da-b48f-8e87e8709089-kube-api-access-6g8vs\") pod \"machine-config-operator-74547568cd-kq6ff\" (UID: \"a8c0fa1a-2ab4-42da-b48f-8e87e8709089\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.648357 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-cert\") pod \"ingress-canary-mncph\" (UID: \"7b2647fd-d28d-4dff-a4ef-e7839fffd33e\") " pod="openshift-ingress-canary/ingress-canary-mncph"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.648860 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-metrics-certs\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.649124 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e1395fca-0450-4427-98ab-c41857892b0a-metrics-tls\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.649346 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/718af076-f027-4594-8294-53ec36b84f3c-profile-collector-cert\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.649425 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.649504 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/af95cab8-055a-4971-acfb-75a0dfe1e394-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nv7qp\" (UID: \"af95cab8-055a-4971-acfb-75a0dfe1e394\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.649654 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f2f03ae-9287-4840-bcda-91d0b68849d7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mjtnc\" (UID: \"5f2f03ae-9287-4840-bcda-91d0b68849d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.653178 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.653770 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5037377a-5754-40b3-8ffc-ef8776d54442-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.655579 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.664592 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.668125 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.673582 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b210beca-0aed-404b-9af5-b704345ce2f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.688669 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.691349 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f844c0-ca4c-4097-bedd-bbb4323cc717-config-volume\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.707157 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.716557 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5037377a-5754-40b3-8ffc-ef8776d54442-srv-cert\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb"
Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.727405 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 13:04:37
crc kubenswrapper[4912]: I0318 13:04:37.732983 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c71dac9-3812-4931-a1ef-0f0796ed93c9-config\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.735279 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.736237 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.236179477 +0000 UTC m=+126.695606912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.747259 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.748204 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-config\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.768115 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.775312 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b2ce32fd-55a5-494b-be29-50eb0382b515-node-bootstrap-token\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.795079 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sxhvr"] Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.807441 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 13:04:37 crc 
kubenswrapper[4912]: I0318 13:04:37.816363 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/65ef2d7f-a45a-4787-a4e6-441dee567ed0-etcd-ca\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.827991 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.830693 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.832478 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.838206 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.838794 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:38.338766125 +0000 UTC m=+126.798193550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: W0318 13:04:37.844181 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-56417cfcc6701c030e52eb8672a94972edbc8f31f29e6a96eefc4f5e8c3c13fe WatchSource:0}: Error finding container 56417cfcc6701c030e52eb8672a94972edbc8f31f29e6a96eefc4f5e8c3c13fe: Status 404 returned error can't find the container with id 56417cfcc6701c030e52eb8672a94972edbc8f31f29e6a96eefc4f5e8c3c13fe Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.846595 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.861585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1c4b9717-5ad2-4993-b69c-ad3266d07766-machine-approver-tls\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.869363 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.873740 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/de010a28-87e3-4340-87fc-9242ad95647a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb9sv\" (UID: \"de010a28-87e3-4340-87fc-9242ad95647a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" Mar 18 13:04:37 crc kubenswrapper[4912]: W0318 13:04:37.881123 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-96853ba70e538e10585b096473c080def9b336573b1d38fcd405f739bb1fdb08 WatchSource:0}: Error finding container 96853ba70e538e10585b096473c080def9b336573b1d38fcd405f739bb1fdb08: Status 404 returned error can't find the container with id 96853ba70e538e10585b096473c080def9b336573b1d38fcd405f739bb1fdb08 Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.883330 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" event={"ID":"d456817a-6755-41c0-bf82-bbb3bf4c35fa","Type":"ContainerStarted","Data":"04ab0faf81bfa3db548e29800172ff12afdf3814ca1b074b07f824e2e0f9c2f1"} Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.884433 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0a9ebf6fd956c74281f92793379697f6083bf79dd53da305b589c025f69de130"} Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.885995 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"56417cfcc6701c030e52eb8672a94972edbc8f31f29e6a96eefc4f5e8c3c13fe"} Mar 18 13:04:37 crc kubenswrapper[4912]: 
I0318 13:04:37.886533 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.887950 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" event={"ID":"5661de32-ecd9-4450-b757-465370105082","Type":"ContainerStarted","Data":"1e4b8992fddcfe792968d2723cbb13aa5fd2f96310edf23cb570b6e8ba1093be"} Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.900599 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q4ppq"] Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.900958 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7feb8268-723e-408b-b800-744481779d38-stats-auth\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.908320 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.921072 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcc5e62-992c-4554-8296-721247078f5b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.928133 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.939132 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.939391 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.439333625 +0000 UTC m=+126.898761050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.940119 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:37 crc kubenswrapper[4912]: E0318 13:04:37.940714 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.440701097 +0000 UTC m=+126.900128522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.942603 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b2ce32fd-55a5-494b-be29-50eb0382b515-certs\") pod \"machine-config-server-9b88n\" (UID: \"b2ce32fd-55a5-494b-be29-50eb0382b515\") " pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.957068 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.961972 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1395fca-0450-4427-98ab-c41857892b0a-trusted-ca\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.968634 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 13:04:37 crc kubenswrapper[4912]: I0318 13:04:37.977658 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb39839-5023-4811-8fc9-0432601dc0d8-serving-cert\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 
18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.003799 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmxmj\" (UniqueName: \"kubernetes.io/projected/2853e2e7-ddce-4741-bb9e-0364ffef8e30-kube-api-access-mmxmj\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.007471 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.013798 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff22e507-73a7-44b1-9eab-c704fb998092-serving-cert\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.041358 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.041470 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.541452592 +0000 UTC m=+127.000880007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.041884 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.042596 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.542568718 +0000 UTC m=+127.001996323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.046915 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.061957 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2853e2e7-ddce-4741-bb9e-0364ffef8e30-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.127901 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.136757 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fc33f6a-4371-412a-9e69-52c81be07685-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:38 crc kubenswrapper[4912]: 
I0318 13:04:38.144078 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.144220 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.644196904 +0000 UTC m=+127.103624329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.144767 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.145567 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:38.645552856 +0000 UTC m=+127.104980281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.148423 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.154151 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c4b9717-5ad2-4993-b69c-ad3266d07766-auth-proxy-config\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.168223 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.173737 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.187395 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.194460 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-audit\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.207697 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.216904 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c330cee-6841-4810-8701-53c782ee170b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-f7ql4\" (UID: \"7c330cee-6841-4810-8701-53c782ee170b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.227494 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.234129 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a2121c-5ff0-4ff6-84de-f1354552a568-audit-policies\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.245844 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.246133 4912 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.746094825 +0000 UTC m=+127.205522250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.246431 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.247276 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.747247253 +0000 UTC m=+127.206674678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.268226 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.273748 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7feb8268-723e-408b-b800-744481779d38-service-ca-bundle\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.288308 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.297936 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.309820 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.315298 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/ff22e507-73a7-44b1-9eab-c704fb998092-image-import-ca\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.328328 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.333752 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.347368 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.348187 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.848161971 +0000 UTC m=+127.307589396 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.348424 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.360515 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-oauth-config\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.371097 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.384167 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-metrics-tls\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.387319 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.396958 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-etcd-client\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: 
\"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.407836 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.414201 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-config\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.427954 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.433963 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.448517 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.449670 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 
13:04:38.450358 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:38.950337009 +0000 UTC m=+127.409764434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.467230 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49b7b174-b61f-4835-abf2-f90c11167250-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.469885 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.473879 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b7b174-b61f-4835-abf2-f90c11167250-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.504835 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqgnx\" (UniqueName: 
\"kubernetes.io/projected/f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973-kube-api-access-pqgnx\") pod \"dns-default-ksmgs\" (UID: \"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973\") " pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.527153 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.537161 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7cw4z"] Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.540894 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39a2121c-5ff0-4ff6-84de-f1354552a568-serving-cert\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.547194 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.550636 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.550826 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.050798226 +0000 UTC m=+127.510225641 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.551006 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.551523 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.051494043 +0000 UTC m=+127.510921468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.551562 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2853e2e7-ddce-4741-bb9e-0364ffef8e30-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jf6sm\" (UID: \"2853e2e7-ddce-4741-bb9e-0364ffef8e30\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.567146 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.588685 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.593108 4912 projected.go:288] Couldn't get configMap openshift-service-ca-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.593122 4912 projected.go:288] Couldn't get configMap openshift-console-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.594248 4912 projected.go:288] Couldn't get configMap openshift-ingress-canary/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 
13:04:38.594327 4912 projected.go:288] Couldn't get configMap openshift-kube-storage-version-migrator-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.595557 4912 projected.go:288] Couldn't get configMap openshift-operator-lifecycle-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.595572 4912 projected.go:288] Couldn't get configMap openshift-controller-manager-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.595550 4912 projected.go:288] Couldn't get configMap openshift-operator-lifecycle-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.595597 4912 projected.go:288] Couldn't get configMap openshift-operator-lifecycle-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.595675 4912 projected.go:288] Couldn't get configMap openshift-operator-lifecycle-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.595712 4912 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.597263 4912 projected.go:288] Couldn't get configMap openshift-console/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.599821 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0dcc5e62-992c-4554-8296-721247078f5b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mp64k\" (UID: \"0dcc5e62-992c-4554-8296-721247078f5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.599921 4912 projected.go:288] Couldn't get configMap openshift-console/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.600001 4912 projected.go:288] Couldn't get configMap openshift-ingress-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.600016 4912 projected.go:288] Couldn't get configMap openshift-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.602934 4912 projected.go:288] Couldn't get configMap openshift-kube-controller-manager-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.602975 4912 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.602946 4912 projected.go:288] Couldn't get configMap openshift-apiserver-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.603088 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c71dac9-3812-4931-a1ef-0f0796ed93c9-kube-api-access podName:9c71dac9-3812-4931-a1ef-0f0796ed93c9 nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:39.103030403 +0000 UTC m=+127.562457838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c71dac9-3812-4931-a1ef-0f0796ed93c9-kube-api-access") pod "kube-controller-manager-operator-78b949d7b-kfvgw" (UID: "9c71dac9-3812-4931-a1ef-0f0796ed93c9") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.605378 4912 request.go:700] Waited for 1.010088933s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27102 Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.606665 4912 projected.go:288] Couldn't get configMap openshift-authentication/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.606897 4912 projected.go:288] Couldn't get configMap openshift-kube-storage-version-migrator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.607051 4912 projected.go:288] Couldn't get configMap openshift-dns-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.608428 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.615309 4912 projected.go:288] Couldn't get configMap openshift-oauth-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.620941 4912 projected.go:288] Couldn't get configMap 
openshift-operator-lifecycle-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.623113 4912 projected.go:288] Couldn't get configMap openshift-kube-scheduler-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.623168 4912 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.623195 4912 projected.go:288] Couldn't get configMap openshift-ingress/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.623241 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-kube-api-access podName:a3292cf9-103c-4b5c-8aec-fdb67ca67f1a nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.123220049 +0000 UTC m=+127.582647474 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-kube-api-access") pod "openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" (UID: "a3292cf9-103c-4b5c-8aec-fdb67ca67f1a") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.623131 4912 projected.go:288] Couldn't get configMap openshift-service-ca/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.628145 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.650726 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.652066 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.652264 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.152236387 +0000 UTC m=+127.611663812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.652555 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.652957 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.152937574 +0000 UTC m=+127.612365189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.667152 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.687928 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.697853 4912 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.697922 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.705121 4912 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.705339 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.707543 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.727990 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.747665 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.754006 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.754163 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.254140309 +0000 UTC m=+127.713567734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.754467 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.754815 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.254805545 +0000 UTC m=+127.714232970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.766663 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.789868 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.801248 4912 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.801874 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.802421 4912 projected.go:288] Couldn't get configMap openshift-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.810474 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.829479 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.851020 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.855709 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.856543 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.355907167 +0000 UTC m=+127.815334602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.856626 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.857206 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.357178038 +0000 UTC m=+127.816605463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.867644 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.892659 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.898112 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" event={"ID":"d456817a-6755-41c0-bf82-bbb3bf4c35fa","Type":"ContainerStarted","Data":"be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.898972 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.910277 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.912289 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q4ppq" event={"ID":"1a9fc2ce-3a71-465b-823d-5b1af71d635c","Type":"ContainerStarted","Data":"63baba207da7038834c545d50604f0abbc1e986a1f6579781ae6268e2b549864"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.912341 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-q4ppq" event={"ID":"1a9fc2ce-3a71-465b-823d-5b1af71d635c","Type":"ContainerStarted","Data":"16b12756876766b3bea808c3752da43e684e32bf85e1a0c15fd7d8434d893207"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.912355 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q4ppq" event={"ID":"1a9fc2ce-3a71-465b-823d-5b1af71d635c","Type":"ContainerStarted","Data":"a8cfea69a93b72ba09bf2224fb53fe7f0dbd2ad47765d78078ca423feb810617"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.913399 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ba5e616ab0de0a6799ac4a456bf948fa4f82c222dcc2fa5803ba4fbbff19bffc"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.913526 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.914831 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"336a2264e56d2205cb7a1cce9e4e2cc35e634185bd9985b1510e3fe893339874"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.914865 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"96853ba70e538e10585b096473c080def9b336573b1d38fcd405f739bb1fdb08"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.916060 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"eeda30aac6d7d09e1ef74898b413241ac018acb7d7fb424ab28fe06c2efcc93c"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.917461 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" event={"ID":"5661de32-ecd9-4450-b757-465370105082","Type":"ContainerStarted","Data":"1a2abf407dbb7a4d45cd9bea2e08dcd690fb6c3f51bcc6ef05aeb8e10c01d4c3"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.919262 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff"] Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.927447 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" event={"ID":"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e","Type":"ContainerStarted","Data":"1211143172e52e89f3a0be2e3ddec718bc67c6dc2435fa4c6b6d9590c9f93e73"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.927491 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" event={"ID":"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e","Type":"ContainerStarted","Data":"c5ac86596dd38339d9408920427a2cac586d23c45a20be51d8a9976d66d183e5"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.927502 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" event={"ID":"ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e","Type":"ContainerStarted","Data":"79163eb512d6b429ced9413ac16fc49f3c1b8edeb8f41180aa876829a0f48f3e"} Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.927709 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.949260 4912 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.956372 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nv7qp"] Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.957902 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:38 crc kubenswrapper[4912]: E0318 13:04:38.958240 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.458201959 +0000 UTC m=+127.917629384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.967168 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 13:04:38 crc kubenswrapper[4912]: W0318 13:04:38.971546 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf95cab8_055a_4971_acfb_75a0dfe1e394.slice/crio-dea2db696c6f9e2496bffc0c820ea6056e776476e6d15fd24599be0925846d81 WatchSource:0}: Error finding container dea2db696c6f9e2496bffc0c820ea6056e776476e6d15fd24599be0925846d81: Status 404 returned error can't find the container with id dea2db696c6f9e2496bffc0c820ea6056e776476e6d15fd24599be0925846d81 Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.987339 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.989743 4912 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 18 13:04:38 crc kubenswrapper[4912]: I0318 13:04:38.989791 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.007420 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.027138 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.049270 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.060362 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.061187 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.561173656 +0000 UTC m=+128.020601081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.069584 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.079666 4912 projected.go:288] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.088200 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.099899 4912 projected.go:288] Couldn't get configMap hostpath-provisioner/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.108368 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.120194 4912 projected.go:288] Couldn't get configMap openshift-cluster-machine-approver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.127930 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.144812 4912 kubelet_pods.go:1007] "Unable to 
retrieve pull secret, the image pull may not succeed." pod="openshift-machine-config-operator/machine-config-server-9b88n" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.144890 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9b88n" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.148472 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.161978 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.162164 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.662132935 +0000 UTC m=+128.121560360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.162567 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c71dac9-3812-4931-a1ef-0f0796ed93c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.162843 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.162899 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.163414 4912 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.663402576 +0000 UTC m=+128.122830001 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.169856 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.172557 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c71dac9-3812-4931-a1ef-0f0796ed93c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kfvgw\" (UID: \"9c71dac9-3812-4931-a1ef-0f0796ed93c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.172598 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3292cf9-103c-4b5c-8aec-fdb67ca67f1a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wgnr\" (UID: \"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.186670 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 13:04:39 crc 
kubenswrapper[4912]: I0318 13:04:39.202553 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv"] Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.208865 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.209448 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p"] Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.220950 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlgz9\" (UniqueName: \"kubernetes.io/projected/2cb39839-5023-4811-8fc9-0432601dc0d8-kube-api-access-zlgz9\") pod \"controller-manager-879f6c89f-pdkfk\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.228266 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.233473 4912 projected.go:194] Error preparing data for projected volume kube-api-access-m28kt for pod openshift-console-operator/console-operator-58897d9998-tjwtn: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.233570 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-kube-api-access-m28kt podName:55da9bcd-23b6-4ea7-8f43-26c43d05a9e3 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.733537383 +0000 UTC m=+128.192964808 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m28kt" (UniqueName: "kubernetes.io/projected/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-kube-api-access-m28kt") pod "console-operator-58897d9998-tjwtn" (UID: "55da9bcd-23b6-4ea7-8f43-26c43d05a9e3") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.247991 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.253285 4912 projected.go:194] Error preparing data for projected volume kube-api-access-2524b for pod openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.253417 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a729c6f8-e561-4f43-8fb9-48834e1873f2-kube-api-access-2524b podName:a729c6f8-e561-4f43-8fb9-48834e1873f2 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.753381431 +0000 UTC m=+128.212808856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2524b" (UniqueName: "kubernetes.io/projected/a729c6f8-e561-4f43-8fb9-48834e1873f2-kube-api-access-2524b") pod "service-ca-operator-777779d784-vm9dz" (UID: "a729c6f8-e561-4f43-8fb9-48834e1873f2") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.264343 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.264856 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.764815956 +0000 UTC m=+128.224243391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.265630 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.266243 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.76622873 +0000 UTC m=+128.225656325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.267748 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.274439 4912 projected.go:194] Error preparing data for projected volume kube-api-access-99xbg for pod openshift-ingress-canary/ingress-canary-mncph: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.274529 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-kube-api-access-99xbg podName:7b2647fd-d28d-4dff-a4ef-e7839fffd33e nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.774504419 +0000 UTC m=+128.233931844 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-99xbg" (UniqueName: "kubernetes.io/projected/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-kube-api-access-99xbg") pod "ingress-canary-mncph" (UID: "7b2647fd-d28d-4dff-a4ef-e7839fffd33e") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.286467 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.295421 4912 projected.go:194] Error preparing data for projected volume kube-api-access-svh96 for pod openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.295571 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc33f6a-4371-412a-9e69-52c81be07685-kube-api-access-svh96 podName:1fc33f6a-4371-412a-9e69-52c81be07685 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.795539485 +0000 UTC m=+128.254966910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-svh96" (UniqueName: "kubernetes.io/projected/1fc33f6a-4371-412a-9e69-52c81be07685-kube-api-access-svh96") pod "kube-storage-version-migrator-operator-b67b599dd-24gsx" (UID: "1fc33f6a-4371-412a-9e69-52c81be07685") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.307383 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.311130 4912 projected.go:194] Error preparing data for projected volume kube-api-access-htjvs for pod openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.311261 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs podName:d5f844c0-ca4c-4097-bedd-bbb4323cc717 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.811223742 +0000 UTC m=+128.270651167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-htjvs" (UniqueName: "kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs") pod "collect-profiles-29563980-8kx5f" (UID: "d5f844c0-ca4c-4097-bedd-bbb4323cc717") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.315924 4912 projected.go:194] Error preparing data for projected volume kube-api-access-lsbw2 for pod openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.316026 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f2f03ae-9287-4840-bcda-91d0b68849d7-kube-api-access-lsbw2 podName:5f2f03ae-9287-4840-bcda-91d0b68849d7 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.816000617 +0000 UTC m=+128.275428042 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lsbw2" (UniqueName: "kubernetes.io/projected/5f2f03ae-9287-4840-bcda-91d0b68849d7-kube-api-access-lsbw2") pod "package-server-manager-789f6589d5-mjtnc" (UID: "5f2f03ae-9287-4840-bcda-91d0b68849d7") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.316314 4912 projected.go:194] Error preparing data for projected volume kube-api-access-n5rb6 for pod openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.316345 4912 projected.go:194] Error preparing data for projected volume kube-api-access-5scdh for pod openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.316377 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-kube-api-access-n5rb6 podName:389bca57-3d65-4ed4-8b0d-9c09c58ecf99 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.816363116 +0000 UTC m=+128.275790541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n5rb6" (UniqueName: "kubernetes.io/projected/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-kube-api-access-n5rb6") pod "packageserver-d55dfcdfc-ptbgq" (UID: "389bca57-3d65-4ed4-8b0d-9c09c58ecf99") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.316428 4912 projected.go:194] Error preparing data for projected volume kube-api-access-wphh7 for pod openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.316449 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037377a-5754-40b3-8ffc-ef8776d54442-kube-api-access-5scdh podName:5037377a-5754-40b3-8ffc-ef8776d54442 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.816421927 +0000 UTC m=+128.275849352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5scdh" (UniqueName: "kubernetes.io/projected/5037377a-5754-40b3-8ffc-ef8776d54442-kube-api-access-5scdh") pod "olm-operator-6b444d44fb-zzbbb" (UID: "5037377a-5754-40b3-8ffc-ef8776d54442") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.316470 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/718af076-f027-4594-8294-53ec36b84f3c-kube-api-access-wphh7 podName:718af076-f027-4594-8294-53ec36b84f3c nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.816457248 +0000 UTC m=+128.275884883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wphh7" (UniqueName: "kubernetes.io/projected/718af076-f027-4594-8294-53ec36b84f3c-kube-api-access-wphh7") pod "catalog-operator-68c6474976-c4jkh" (UID: "718af076-f027-4594-8294-53ec36b84f3c") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.333136 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.335908 4912 projected.go:194] Error preparing data for projected volume kube-api-access-l2zvk for pod openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.336017 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-kube-api-access-l2zvk podName:3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.835987818 +0000 UTC m=+128.295415243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l2zvk" (UniqueName: "kubernetes.io/projected/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-kube-api-access-l2zvk") pod "openshift-controller-manager-operator-756b6f6bc6-mnzgj" (UID: "3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.348847 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.355864 4912 projected.go:194] Error preparing data for projected volume kube-api-access-m7t5k for pod openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.356020 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k podName:65fd5ca1-a95e-47d1-9e3c-62178a36eab9 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.85598908 +0000 UTC m=+128.315416495 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m7t5k" (UniqueName: "kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k") pod "route-controller-manager-6576b87f9c-9c657" (UID: "65fd5ca1-a95e-47d1-9e3c-62178a36eab9") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.367100 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.367408 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.867386574 +0000 UTC m=+128.326813999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.367592 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.368072 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.8680522 +0000 UTC m=+128.327479625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.368697 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.372537 4912 projected.go:194] Error preparing data for projected volume kube-api-access-r5ktm for pod openshift-console/console-f9d7485db-vpn9z: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.372612 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm podName:39c7b2b0-6f20-426b-961d-65878696145f nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.872591029 +0000 UTC m=+128.332018454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r5ktm" (UniqueName: "kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm") pod "console-f9d7485db-vpn9z" (UID: "39c7b2b0-6f20-426b-961d-65878696145f") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.377669 4912 projected.go:194] Error preparing data for projected volume kube-api-access-wbw49 for pod openshift-console/downloads-7954f5f757-2ghlz: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.377733 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b34ff88-74eb-45ce-acd4-3b7b272e1747-kube-api-access-wbw49 podName:1b34ff88-74eb-45ce-acd4-3b7b272e1747 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.877721002 +0000 UTC m=+128.337148427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wbw49" (UniqueName: "kubernetes.io/projected/1b34ff88-74eb-45ce-acd4-3b7b272e1747-kube-api-access-wbw49") pod "downloads-7954f5f757-2ghlz" (UID: "1b34ff88-74eb-45ce-acd4-3b7b272e1747") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.386417 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.390363 4912 projected.go:194] Error preparing data for projected volume kube-api-access-v7qvb for pod openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.390475 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b210beca-0aed-404b-9af5-b704345ce2f8-kube-api-access-v7qvb 
podName:b210beca-0aed-404b-9af5-b704345ce2f8 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.890445308 +0000 UTC m=+128.349872733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-v7qvb" (UniqueName: "kubernetes.io/projected/b210beca-0aed-404b-9af5-b704345ce2f8-kube-api-access-v7qvb") pod "openshift-config-operator-7777fb866f-tqk5x" (UID: "b210beca-0aed-404b-9af5-b704345ce2f8") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.408492 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.411146 4912 projected.go:194] Error preparing data for projected volume kube-api-access-7r7zq for pod openshift-ingress-operator/ingress-operator-5b745b69d9-sn298: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.411235 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-kube-api-access-7r7zq podName:e1395fca-0450-4427-98ab-c41857892b0a nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.911214018 +0000 UTC m=+128.370641433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7r7zq" (UniqueName: "kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-kube-api-access-7r7zq") pod "ingress-operator-5b745b69d9-sn298" (UID: "e1395fca-0450-4427-98ab-c41857892b0a") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.427075 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.436612 4912 projected.go:194] Error preparing data for projected volume kube-api-access-96x7l for pod openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.436730 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49b7b174-b61f-4835-abf2-f90c11167250-kube-api-access-96x7l podName:49b7b174-b61f-4835-abf2-f90c11167250 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.936704232 +0000 UTC m=+128.396131657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-96x7l" (UniqueName: "kubernetes.io/projected/49b7b174-b61f-4835-abf2-f90c11167250-kube-api-access-96x7l") pod "openshift-apiserver-operator-796bbdcf4f-xpph2" (UID: "49b7b174-b61f-4835-abf2-f90c11167250") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.448640 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.457241 4912 projected.go:194] Error preparing data for projected volume kube-api-access-fb2sk for pod openshift-authentication/oauth-openshift-558db77b4-q8fqp: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.457375 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk podName:65b97390-afd1-41da-9b38-f3467a213007 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.957341378 +0000 UTC m=+128.416768803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fb2sk" (UniqueName: "kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk") pod "oauth-openshift-558db77b4-q8fqp" (UID: "65b97390-afd1-41da-9b38-f3467a213007") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.466683 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.467504 4912 projected.go:194] Error preparing data for projected volume kube-api-access-tmx8x for pod openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.467611 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/542111b4-ff3a-41fc-a963-0b55c1ace3e9-kube-api-access-tmx8x podName:542111b4-ff3a-41fc-a963-0b55c1ace3e9 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.967586185 +0000 UTC m=+128.427013610 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tmx8x" (UniqueName: "kubernetes.io/projected/542111b4-ff3a-41fc-a963-0b55c1ace3e9-kube-api-access-tmx8x") pod "migrator-59844c95c7-9vx6l" (UID: "542111b4-ff3a-41fc-a963-0b55c1ace3e9") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.468591 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.468789 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.968762033 +0000 UTC m=+128.428189448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.469233 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.469967 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.969938941 +0000 UTC m=+128.429366366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.486552 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.487452 4912 projected.go:194] Error preparing data for projected volume kube-api-access-7jzrh for pod openshift-dns-operator/dns-operator-744455d44c-nvlj8: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.487539 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c489444-b3c7-4ec6-a959-6dcdb2b83660-kube-api-access-7jzrh podName:6c489444-b3c7-4ec6-a959-6dcdb2b83660 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:39.987511864 +0000 UTC m=+128.446939289 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7jzrh" (UniqueName: "kubernetes.io/projected/6c489444-b3c7-4ec6-a959-6dcdb2b83660-kube-api-access-7jzrh") pod "dns-operator-744455d44c-nvlj8" (UID: "6c489444-b3c7-4ec6-a959-6dcdb2b83660") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.507434 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.515995 4912 projected.go:194] Error preparing data for projected volume kube-api-access-z2dd6 for pod openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.516148 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39a2121c-5ff0-4ff6-84de-f1354552a568-kube-api-access-z2dd6 podName:39a2121c-5ff0-4ff6-84de-f1354552a568 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.016116662 +0000 UTC m=+128.475544087 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z2dd6" (UniqueName: "kubernetes.io/projected/39a2121c-5ff0-4ff6-84de-f1354552a568-kube-api-access-z2dd6") pod "apiserver-7bbb656c7d-qnf86" (UID: "39a2121c-5ff0-4ff6-84de-f1354552a568") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.527774 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.533711 4912 projected.go:194] Error preparing data for projected volume kube-api-access-84jrv for pod openshift-ingress/router-default-5444994796-bbvtw: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.533841 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7feb8268-723e-408b-b800-744481779d38-kube-api-access-84jrv podName:7feb8268-723e-408b-b800-744481779d38 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.033811168 +0000 UTC m=+128.493238593 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-84jrv" (UniqueName: "kubernetes.io/projected/7feb8268-723e-408b-b800-744481779d38-kube-api-access-84jrv") pod "router-default-5444994796-bbvtw" (UID: "7feb8268-723e-408b-b800-744481779d38") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.548669 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.553587 4912 projected.go:194] Error preparing data for projected volume kube-api-access-vp6cs for pod openshift-service-ca/service-ca-9c57cc56f-gh7ht: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.553739 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-kube-api-access-vp6cs podName:6508b29f-a1b9-4a3a-aa9d-312c53ed90b1 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.053710707 +0000 UTC m=+128.513138122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vp6cs" (UniqueName: "kubernetes.io/projected/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-kube-api-access-vp6cs") pod "service-ca-9c57cc56f-gh7ht" (UID: "6508b29f-a1b9-4a3a-aa9d-312c53ed90b1") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.567404 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.568286 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.571120 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.571320 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.07129362 +0000 UTC m=+128.530721045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.571561 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.572198 4912 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.072165191 +0000 UTC m=+128.531592616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.587284 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.589144 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.608347 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.608523 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.628507 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.632600 4912 projected.go:194] Error preparing data for projected volume kube-api-access-zdjf9 for pod openshift-apiserver/apiserver-76f77b778f-t75hw: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.632778 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff22e507-73a7-44b1-9eab-c704fb998092-kube-api-access-zdjf9 podName:ff22e507-73a7-44b1-9eab-c704fb998092 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.132740448 +0000 UTC m=+128.592167873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zdjf9" (UniqueName: "kubernetes.io/projected/ff22e507-73a7-44b1-9eab-c704fb998092-kube-api-access-zdjf9") pod "apiserver-76f77b778f-t75hw" (UID: "ff22e507-73a7-44b1-9eab-c704fb998092") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.667484 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.670241 4912 projected.go:194] Error preparing data for projected volume kube-api-access-2gh55 for pod openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.670400 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c330cee-6841-4810-8701-53c782ee170b-kube-api-access-2gh55 
podName:7c330cee-6841-4810-8701-53c782ee170b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.170357663 +0000 UTC m=+128.629785098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2gh55" (UniqueName: "kubernetes.io/projected/7c330cee-6841-4810-8701-53c782ee170b-kube-api-access-2gh55") pod "cluster-samples-operator-665b6dd947-f7ql4" (UID: "7c330cee-6841-4810-8701-53c782ee170b") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.673391 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.674155 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.174118464 +0000 UTC m=+128.633545899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.676102 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.679417 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.1793759 +0000 UTC m=+128.638803325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.688922 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.694113 4912 projected.go:194] Error preparing data for projected volume kube-api-access-mg5w6 for pod hostpath-provisioner/csi-hostpathplugin-4r8tg: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.694197 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3abcfc85-e792-4ba8-a6c2-db7130b1f423-kube-api-access-mg5w6 podName:3abcfc85-e792-4ba8-a6c2-db7130b1f423 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.194177966 +0000 UTC m=+128.653605391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mg5w6" (UniqueName: "kubernetes.io/projected/3abcfc85-e792-4ba8-a6c2-db7130b1f423-kube-api-access-mg5w6") pod "csi-hostpathplugin-4r8tg" (UID: "3abcfc85-e792-4ba8-a6c2-db7130b1f423") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.711481 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.721749 4912 projected.go:194] Error preparing data for projected volume kube-api-access-hvnkn for pod openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp: failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.721904 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c4b9717-5ad2-4993-b69c-ad3266d07766-kube-api-access-hvnkn podName:1c4b9717-5ad2-4993-b69c-ad3266d07766 nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.221868843 +0000 UTC m=+128.681296268 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hvnkn" (UniqueName: "kubernetes.io/projected/1c4b9717-5ad2-4993-b69c-ad3266d07766-kube-api-access-hvnkn") pod "machine-approver-56656f9798-zshjp" (UID: "1c4b9717-5ad2-4993-b69c-ad3266d07766") : failed to sync configmap cache: timed out waiting for the condition Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.727486 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.745846 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qr4s\" (UniqueName: \"kubernetes.io/projected/65ef2d7f-a45a-4787-a4e6-441dee567ed0-kube-api-access-4qr4s\") pod \"etcd-operator-b45778765-hdfkl\" (UID: \"65ef2d7f-a45a-4787-a4e6-441dee567ed0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.747625 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.756512 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqrgj\" (UniqueName: \"kubernetes.io/projected/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-kube-api-access-xqrgj\") pod \"marketplace-operator-79b997595-jsbwx\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.773547 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.775851 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.780354 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.780715 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2524b\" (UniqueName: \"kubernetes.io/projected/a729c6f8-e561-4f43-8fb9-48834e1873f2-kube-api-access-2524b\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.780794 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m28kt\" (UniqueName: \"kubernetes.io/projected/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-kube-api-access-m28kt\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.780889 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99xbg\" (UniqueName: \"kubernetes.io/projected/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-kube-api-access-99xbg\") pod \"ingress-canary-mncph\" (UID: \"7b2647fd-d28d-4dff-a4ef-e7839fffd33e\") " pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.782135 4912 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.282104542 +0000 UTC m=+128.741531967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.794023 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2524b\" (UniqueName: \"kubernetes.io/projected/a729c6f8-e561-4f43-8fb9-48834e1873f2-kube-api-access-2524b\") pod \"service-ca-operator-777779d784-vm9dz\" (UID: \"a729c6f8-e561-4f43-8fb9-48834e1873f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.794890 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99xbg\" (UniqueName: \"kubernetes.io/projected/7b2647fd-d28d-4dff-a4ef-e7839fffd33e-kube-api-access-99xbg\") pod \"ingress-canary-mncph\" (UID: \"7b2647fd-d28d-4dff-a4ef-e7839fffd33e\") " pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.800274 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.802337 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.807207 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.807768 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.819123 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m28kt\" (UniqueName: \"kubernetes.io/projected/55da9bcd-23b6-4ea7-8f43-26c43d05a9e3-kube-api-access-m28kt\") pod \"console-operator-58897d9998-tjwtn\" (UID: \"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3\") " pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.843731 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ksmgs"] Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.850833 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.857711 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" podStartSLOduration=22.85768421 podStartE2EDuration="22.85768421s" podCreationTimestamp="2026-03-18 13:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:39.854722519 +0000 UTC m=+128.314149954" watchObservedRunningTime="2026-03-18 13:04:39.85768421 +0000 UTC m=+128.317111635" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.860321 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.864173 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k"] Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.871834 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.880384 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882023 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5scdh\" (UniqueName: \"kubernetes.io/projected/5037377a-5754-40b3-8ffc-ef8776d54442-kube-api-access-5scdh\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882113 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svh96\" (UniqueName: \"kubernetes.io/projected/1fc33f6a-4371-412a-9e69-52c81be07685-kube-api-access-svh96\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882159 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7t5k\" (UniqueName: \"kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882184 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882205 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htjvs\" (UniqueName: \"kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882232 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbw49\" (UniqueName: \"kubernetes.io/projected/1b34ff88-74eb-45ce-acd4-3b7b272e1747-kube-api-access-wbw49\") pod \"downloads-7954f5f757-2ghlz\" (UID: \"1b34ff88-74eb-45ce-acd4-3b7b272e1747\") " pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882286 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2zvk\" (UniqueName: \"kubernetes.io/projected/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-kube-api-access-l2zvk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" (UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882307 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wphh7\" (UniqueName: \"kubernetes.io/projected/718af076-f027-4594-8294-53ec36b84f3c-kube-api-access-wphh7\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882328 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ktm\" (UniqueName: \"kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882434 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rb6\" (UniqueName: \"kubernetes.io/projected/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-kube-api-access-n5rb6\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.882462 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsbw2\" (UniqueName: \"kubernetes.io/projected/5f2f03ae-9287-4840-bcda-91d0b68849d7-kube-api-access-lsbw2\") pod \"package-server-manager-789f6589d5-mjtnc\" (UID: \"5f2f03ae-9287-4840-bcda-91d0b68849d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.883302 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.383280376 +0000 UTC m=+128.842707801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.889553 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5scdh\" (UniqueName: \"kubernetes.io/projected/5037377a-5754-40b3-8ffc-ef8776d54442-kube-api-access-5scdh\") pod \"olm-operator-6b444d44fb-zzbbb\" (UID: \"5037377a-5754-40b3-8ffc-ef8776d54442\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.889621 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5ktm\" (UniqueName: \"kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm\") pod \"console-f9d7485db-vpn9z\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.890685 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbw49\" (UniqueName: \"kubernetes.io/projected/1b34ff88-74eb-45ce-acd4-3b7b272e1747-kube-api-access-wbw49\") pod \"downloads-7954f5f757-2ghlz\" (UID: \"1b34ff88-74eb-45ce-acd4-3b7b272e1747\") " pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.891521 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2zvk\" (UniqueName: \"kubernetes.io/projected/3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a-kube-api-access-l2zvk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnzgj\" 
(UID: \"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.895188 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svh96\" (UniqueName: \"kubernetes.io/projected/1fc33f6a-4371-412a-9e69-52c81be07685-kube-api-access-svh96\") pod \"kube-storage-version-migrator-operator-b67b599dd-24gsx\" (UID: \"1fc33f6a-4371-412a-9e69-52c81be07685\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.896496 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphh7\" (UniqueName: \"kubernetes.io/projected/718af076-f027-4594-8294-53ec36b84f3c-kube-api-access-wphh7\") pod \"catalog-operator-68c6474976-c4jkh\" (UID: \"718af076-f027-4594-8294-53ec36b84f3c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.896908 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htjvs\" (UniqueName: \"kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs\") pod \"collect-profiles-29563980-8kx5f\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.900438 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5rb6\" (UniqueName: \"kubernetes.io/projected/389bca57-3d65-4ed4-8b0d-9c09c58ecf99-kube-api-access-n5rb6\") pod \"packageserver-d55dfcdfc-ptbgq\" (UID: \"389bca57-3d65-4ed4-8b0d-9c09c58ecf99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.901021 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7t5k\" (UniqueName: \"kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k\") pod \"route-controller-manager-6576b87f9c-9c657\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.907362 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsbw2\" (UniqueName: \"kubernetes.io/projected/5f2f03ae-9287-4840-bcda-91d0b68849d7-kube-api-access-lsbw2\") pod \"package-server-manager-789f6589d5-mjtnc\" (UID: \"5f2f03ae-9287-4840-bcda-91d0b68849d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.933241 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm"] Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.939254 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.941409 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.949535 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.955997 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ksmgs" event={"ID":"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973","Type":"ContainerStarted","Data":"5dcf61a241d6c4420ad98218d1c07f96455abb3582bf4342caab04bb661745a0"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.958577 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.963446 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9b88n" event={"ID":"b2ce32fd-55a5-494b-be29-50eb0382b515","Type":"ContainerStarted","Data":"24ccee3b3b61ff008a967b236bcf4e92fc55bb4d043eec4dd95ad566f4fcfbbf"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.963524 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9b88n" event={"ID":"b2ce32fd-55a5-494b-be29-50eb0382b515","Type":"ContainerStarted","Data":"004ff413486b03f076ce56a18ebbf5fdc66f069d981119d3f4170af3e07de144"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.965789 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" event={"ID":"0dcc5e62-992c-4554-8296-721247078f5b","Type":"ContainerStarted","Data":"483d0bb69e5cd16d39eba72e9ae294d9d8c019ec06821b6c79067b5b8c9b895c"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.972670 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" event={"ID":"af95cab8-055a-4971-acfb-75a0dfe1e394","Type":"ContainerStarted","Data":"1e9078771817f2e1be46d448b932be2fe233f6bffb3a80561c0d79c95371bde2"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.972725 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" event={"ID":"af95cab8-055a-4971-acfb-75a0dfe1e394","Type":"ContainerStarted","Data":"ec3f27e5eb565b8df7cfeae9ca9fdab14881cf92fed92f00650a77eac7e7c6c8"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.972737 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" event={"ID":"af95cab8-055a-4971-acfb-75a0dfe1e394","Type":"ContainerStarted","Data":"dea2db696c6f9e2496bffc0c820ea6056e776476e6d15fd24599be0925846d81"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.977625 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" event={"ID":"de010a28-87e3-4340-87fc-9242ad95647a","Type":"ContainerStarted","Data":"5aea7436ddf984da2ac8db19973ed1fc63ea96d87acda9940336aa5acf815cd4"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.977683 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" event={"ID":"de010a28-87e3-4340-87fc-9242ad95647a","Type":"ContainerStarted","Data":"5d754e2bdf1a432b4fe0993bf27780004f590cef8585ccbb1a93c6b7ed9c0607"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.983307 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.983699 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmx8x\" (UniqueName: \"kubernetes.io/projected/542111b4-ff3a-41fc-a963-0b55c1ace3e9-kube-api-access-tmx8x\") pod \"migrator-59844c95c7-9vx6l\" (UID: \"542111b4-ff3a-41fc-a963-0b55c1ace3e9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.983735 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7qvb\" (UniqueName: \"kubernetes.io/projected/b210beca-0aed-404b-9af5-b704345ce2f8-kube-api-access-v7qvb\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.983789 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2sk\" (UniqueName: \"kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.983828 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96x7l\" (UniqueName: \"kubernetes.io/projected/49b7b174-b61f-4835-abf2-f90c11167250-kube-api-access-96x7l\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.983870 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r7zq\" (UniqueName: 
\"kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-kube-api-access-7r7zq\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:39 crc kubenswrapper[4912]: E0318 13:04:39.985323 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.48528411 +0000 UTC m=+128.944711545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.995206 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" event={"ID":"f9b36bfe-8b4f-4dc7-81df-783180e97a44","Type":"ContainerStarted","Data":"907e0bd84b67a1267dd61d315606fba32368be3ffb9796f75846b58b6b685c3d"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.995267 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" event={"ID":"f9b36bfe-8b4f-4dc7-81df-783180e97a44","Type":"ContainerStarted","Data":"7f16c4ae9aa847e905f1fa3fb67bb7b04da562ad2c55e24677d983109a2c9762"} Mar 18 13:04:39 crc kubenswrapper[4912]: I0318 13:04:39.995282 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" 
event={"ID":"f9b36bfe-8b4f-4dc7-81df-783180e97a44","Type":"ContainerStarted","Data":"1bf45b5be33387c60c2ca7260c871e4bcded6f4cfba6761c945924d9d3d6630e"} Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.004749 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" event={"ID":"a8c0fa1a-2ab4-42da-b48f-8e87e8709089","Type":"ContainerStarted","Data":"67683976467ff9ecfd494018cae411861dcc6f5b3092a4dbb7a6141d043a9010"} Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.004815 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" event={"ID":"a8c0fa1a-2ab4-42da-b48f-8e87e8709089","Type":"ContainerStarted","Data":"cbdc8eda09a95cf881c4940ed07dd961ea42a12233bebd854cc7cd9a7398fd3d"} Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.004828 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" event={"ID":"a8c0fa1a-2ab4-42da-b48f-8e87e8709089","Type":"ContainerStarted","Data":"1060052fe2d43a85ccbb3648920b736c40df2b7a004bc0cc2a2ba2f80a95becc"} Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.009307 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" podStartSLOduration=73.009284658 podStartE2EDuration="1m13.009284658s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.009215196 +0000 UTC m=+128.468642631" watchObservedRunningTime="2026-03-18 13:04:40.009284658 +0000 UTC m=+128.468712083" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.010583 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 
18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.014509 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mncph" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.015700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2sk\" (UniqueName: \"kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk\") pod \"oauth-openshift-558db77b4-q8fqp\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.015715 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r7zq\" (UniqueName: \"kubernetes.io/projected/e1395fca-0450-4427-98ab-c41857892b0a-kube-api-access-7r7zq\") pod \"ingress-operator-5b745b69d9-sn298\" (UID: \"e1395fca-0450-4427-98ab-c41857892b0a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.016416 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7qvb\" (UniqueName: \"kubernetes.io/projected/b210beca-0aed-404b-9af5-b704345ce2f8-kube-api-access-v7qvb\") pod \"openshift-config-operator-7777fb866f-tqk5x\" (UID: \"b210beca-0aed-404b-9af5-b704345ce2f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.023109 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmx8x\" (UniqueName: \"kubernetes.io/projected/542111b4-ff3a-41fc-a963-0b55c1ace3e9-kube-api-access-tmx8x\") pod \"migrator-59844c95c7-9vx6l\" (UID: \"542111b4-ff3a-41fc-a963-0b55c1ace3e9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.025839 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-96x7l\" (UniqueName: \"kubernetes.io/projected/49b7b174-b61f-4835-abf2-f90c11167250-kube-api-access-96x7l\") pod \"openshift-apiserver-operator-796bbdcf4f-xpph2\" (UID: \"49b7b174-b61f-4835-abf2-f90c11167250\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.048826 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.051124 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.067715 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.067921 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.072605 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.088715 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dd6\" (UniqueName: \"kubernetes.io/projected/39a2121c-5ff0-4ff6-84de-f1354552a568-kube-api-access-z2dd6\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.088765 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6cs\" (UniqueName: \"kubernetes.io/projected/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-kube-api-access-vp6cs\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.088822 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jzrh\" (UniqueName: \"kubernetes.io/projected/6c489444-b3c7-4ec6-a959-6dcdb2b83660-kube-api-access-7jzrh\") pod \"dns-operator-744455d44c-nvlj8\" (UID: \"6c489444-b3c7-4ec6-a959-6dcdb2b83660\") " pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.089285 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:40 crc 
kubenswrapper[4912]: I0318 13:04:40.089422 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jrv\" (UniqueName: \"kubernetes.io/projected/7feb8268-723e-408b-b800-744481779d38-kube-api-access-84jrv\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.093328 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.593300189 +0000 UTC m=+129.052727614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.115828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84jrv\" (UniqueName: \"kubernetes.io/projected/7feb8268-723e-408b-b800-744481779d38-kube-api-access-84jrv\") pod \"router-default-5444994796-bbvtw\" (UID: \"7feb8268-723e-408b-b800-744481779d38\") " pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.116021 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dd6\" (UniqueName: \"kubernetes.io/projected/39a2121c-5ff0-4ff6-84de-f1354552a568-kube-api-access-z2dd6\") pod \"apiserver-7bbb656c7d-qnf86\" (UID: \"39a2121c-5ff0-4ff6-84de-f1354552a568\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.116179 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.116518 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6cs\" (UniqueName: \"kubernetes.io/projected/6508b29f-a1b9-4a3a-aa9d-312c53ed90b1-kube-api-access-vp6cs\") pod \"service-ca-9c57cc56f-gh7ht\" (UID: \"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1\") " pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.127438 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7cw4z" podStartSLOduration=73.1274077 podStartE2EDuration="1m13.1274077s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.1153771 +0000 UTC m=+128.574804535" watchObservedRunningTime="2026-03-18 13:04:40.1274077 +0000 UTC m=+128.586835125" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.128162 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.129737 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.129919 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.133775 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jzrh\" (UniqueName: \"kubernetes.io/projected/6c489444-b3c7-4ec6-a959-6dcdb2b83660-kube-api-access-7jzrh\") pod \"dns-operator-744455d44c-nvlj8\" (UID: \"6c489444-b3c7-4ec6-a959-6dcdb2b83660\") " pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.152904 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.154835 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.178227 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.185152 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.190396 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.190664 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjf9\" (UniqueName: \"kubernetes.io/projected/ff22e507-73a7-44b1-9eab-c704fb998092-kube-api-access-zdjf9\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.190748 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gh55\" (UniqueName: \"kubernetes.io/projected/7c330cee-6841-4810-8701-53c782ee170b-kube-api-access-2gh55\") pod \"cluster-samples-operator-665b6dd947-f7ql4\" (UID: \"7c330cee-6841-4810-8701-53c782ee170b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.191328 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.691304517 +0000 UTC m=+129.150731942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.197356 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gh55\" (UniqueName: \"kubernetes.io/projected/7c330cee-6841-4810-8701-53c782ee170b-kube-api-access-2gh55\") pod \"cluster-samples-operator-665b6dd947-f7ql4\" (UID: \"7c330cee-6841-4810-8701-53c782ee170b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.198766 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.201550 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.240663 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.247541 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.248485 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.248903 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.249153 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.253943 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.258494 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.266863 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.267876 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjf9\" (UniqueName: \"kubernetes.io/projected/ff22e507-73a7-44b1-9eab-c704fb998092-kube-api-access-zdjf9\") pod \"apiserver-76f77b778f-t75hw\" (UID: \"ff22e507-73a7-44b1-9eab-c704fb998092\") " pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.276153 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.293455 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.294465 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.294527 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5w6\" (UniqueName: \"kubernetes.io/projected/3abcfc85-e792-4ba8-a6c2-db7130b1f423-kube-api-access-mg5w6\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.294549 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvnkn\" (UniqueName: \"kubernetes.io/projected/1c4b9717-5ad2-4993-b69c-ad3266d07766-kube-api-access-hvnkn\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.299875 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.299950 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.799934871 +0000 UTC m=+129.259362296 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.308766 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.317551 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.317952 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvnkn\" (UniqueName: \"kubernetes.io/projected/1c4b9717-5ad2-4993-b69c-ad3266d07766-kube-api-access-hvnkn\") pod \"machine-approver-56656f9798-zshjp\" (UID: \"1c4b9717-5ad2-4993-b69c-ad3266d07766\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.320696 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5w6\" (UniqueName: \"kubernetes.io/projected/3abcfc85-e792-4ba8-a6c2-db7130b1f423-kube-api-access-mg5w6\") pod \"csi-hostpathplugin-4r8tg\" (UID: \"3abcfc85-e792-4ba8-a6c2-db7130b1f423\") " pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.341585 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q4ppq" podStartSLOduration=73.341563373 podStartE2EDuration="1m13.341563373s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.340336273 +0000 UTC m=+128.799763708" watchObservedRunningTime="2026-03-18 13:04:40.341563373 +0000 UTC m=+128.800990798" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.347150 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.350877 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.367479 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.378978 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.397434 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.397797 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:40.897781545 +0000 UTC m=+129.357208970 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.398320 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.405379 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.428807 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw"] Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.432755 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.437607 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr"] Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.437738 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.449536 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.459601 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.471437 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.478961 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.499738 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.506473 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.006023128 +0000 UTC m=+129.465450553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.511753 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.515179 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.602208 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.602911 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.102883019 +0000 UTC m=+129.562310444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.623305 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9b88n" podStartSLOduration=23.62328147 podStartE2EDuration="23.62328147s" podCreationTimestamp="2026-03-18 13:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.621890126 +0000 UTC m=+129.081317551" watchObservedRunningTime="2026-03-18 13:04:40.62328147 +0000 UTC m=+129.082708895" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.623482 4912 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.625540 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.699895 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb9sv" podStartSLOduration=73.699876783 podStartE2EDuration="1m13.699876783s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.675631589 +0000 UTC m=+129.135059034" watchObservedRunningTime="2026-03-18 13:04:40.699876783 +0000 UTC m=+129.159304208" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.700926 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-nv7qp" podStartSLOduration=73.700920388 podStartE2EDuration="1m13.700920388s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.699250147 +0000 UTC m=+129.158677592" watchObservedRunningTime="2026-03-18 13:04:40.700920388 +0000 UTC m=+129.160347813" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.708516 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.708966 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:41.208944421 +0000 UTC m=+129.668371846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.744496 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kq6ff" podStartSLOduration=73.744467535 podStartE2EDuration="1m13.744467535s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.742377075 +0000 UTC m=+129.201804500" watchObservedRunningTime="2026-03-18 13:04:40.744467535 +0000 UTC m=+129.203894950" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.781701 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d7s7p" podStartSLOduration=73.781681811 podStartE2EDuration="1m13.781681811s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:40.779958189 +0000 UTC m=+129.239385624" watchObservedRunningTime="2026-03-18 13:04:40.781681811 +0000 UTC m=+129.241109236" Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.810279 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.810801 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.310778361 +0000 UTC m=+129.770205786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:40 crc kubenswrapper[4912]: I0318 13:04:40.918772 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:40 crc kubenswrapper[4912]: E0318 13:04:40.919289 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.419273141 +0000 UTC m=+129.878700566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.016750 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ksmgs" event={"ID":"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973","Type":"ContainerStarted","Data":"6433c5839b0427828e9ddcef361b0a164ec498f9ad76fdacdcea750377f9e475"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.026703 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.026908 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.5268785 +0000 UTC m=+129.986305915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.027160 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.027586 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.527569307 +0000 UTC m=+129.986996732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.031672 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" event={"ID":"1c4b9717-5ad2-4993-b69c-ad3266d07766","Type":"ContainerStarted","Data":"dd2885993e82568eebbd4fe67b1bbb8bbf814ba29b1544dc135ac29aedc0a7a8"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.045776 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" event={"ID":"0dcc5e62-992c-4554-8296-721247078f5b","Type":"ContainerStarted","Data":"f6b280c294b285142c449b4d88bbe13188a177671cf04b2948440d6794de1a40"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.051959 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" event={"ID":"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a","Type":"ContainerStarted","Data":"cffa25f748ef4154c6175ef5ef0cc43fba29f1509c5cebd6e3ae018a12bf7858"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.054313 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" event={"ID":"2853e2e7-ddce-4741-bb9e-0364ffef8e30","Type":"ContainerStarted","Data":"15e049004785dbe9c3d96f316dafd2f5c12afe7b941cfee812553239aef3065e"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.054346 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" event={"ID":"2853e2e7-ddce-4741-bb9e-0364ffef8e30","Type":"ContainerStarted","Data":"63ec31cb9cbe0c9f4203316c5f2926ed89dc84cf6bbc74b2875705e809453c93"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.072052 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mp64k" podStartSLOduration=74.072011426 podStartE2EDuration="1m14.072011426s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:41.069591608 +0000 UTC m=+129.529019033" watchObservedRunningTime="2026-03-18 13:04:41.072011426 +0000 UTC m=+129.531438851" Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.078967 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bbvtw" event={"ID":"7feb8268-723e-408b-b800-744481779d38","Type":"ContainerStarted","Data":"07b616d91029b1237a645b7a9bbdac7df44dda5057682d781bafc0faa35aa31d"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.093764 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jf6sm" podStartSLOduration=74.093733269 podStartE2EDuration="1m14.093733269s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:41.089256431 +0000 UTC m=+129.548683866" watchObservedRunningTime="2026-03-18 13:04:41.093733269 +0000 UTC m=+129.553160694" Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.099927 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" 
event={"ID":"9c71dac9-3812-4931-a1ef-0f0796ed93c9","Type":"ContainerStarted","Data":"56b89abf9cb75b418c0484e5470259af4a62fbca23ed27ddc75bc251717ae490"} Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.129124 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.130143 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.630110664 +0000 UTC m=+130.089538089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.234825 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.238869 4912 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.73885011 +0000 UTC m=+130.198277525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.336149 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.336351 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.836320776 +0000 UTC m=+130.295748201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.336579 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.337693 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.837666058 +0000 UTC m=+130.297093483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.423515 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nvzrd"] Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.438634 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.439206 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:41.93918476 +0000 UTC m=+130.398612185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.541346 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.542054 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.042021345 +0000 UTC m=+130.501448770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.642124 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.642692 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.142668676 +0000 UTC m=+130.602096101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.744570 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.745226 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.245201453 +0000 UTC m=+130.704628878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.844819 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"] Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.845985 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.847510 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.347490874 +0000 UTC m=+130.806918299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.889081 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pdkfk"] Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.895680 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mncph"] Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.909370 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdfkl"] Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.911926 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz"] Mar 18 13:04:41 crc kubenswrapper[4912]: I0318 13:04:41.946953 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:41 crc kubenswrapper[4912]: E0318 13:04:41.947248 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.447236094 +0000 UTC m=+130.906663519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.048421 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.048831 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.548814568 +0000 UTC m=+131.008241993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.083265 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vpn9z"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.164786 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.164866 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.66484053 +0000 UTC m=+131.124267955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.167572 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" event={"ID":"1c4b9717-5ad2-4993-b69c-ad3266d07766","Type":"ContainerStarted","Data":"895df13021f7babeb816ce40e4ed3775a8d9cc516b9747b7aa96bf56b25ecc13"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.167646 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" event={"ID":"1c4b9717-5ad2-4993-b69c-ad3266d07766","Type":"ContainerStarted","Data":"78b0167c5eb15b7b8b2008eb282029ded9f304e408d876f59fa3ee4168dfa5ab"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.187221 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" event={"ID":"a3292cf9-103c-4b5c-8aec-fdb67ca67f1a","Type":"ContainerStarted","Data":"b53b922e41f8837c4716978e62a2ed4138d6b7479ae37b5bf9173b406100a8ee"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.203979 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" event={"ID":"d5f844c0-ca4c-4097-bedd-bbb4323cc717","Type":"ContainerStarted","Data":"a3727f6e0040dfbd1fb6d3c42a7a69289d42b9fcd627cf738086520a5d5af730"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.205682 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zshjp" podStartSLOduration=75.205666862 podStartE2EDuration="1m15.205666862s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.203759036 +0000 UTC m=+130.663186481" watchObservedRunningTime="2026-03-18 13:04:42.205666862 +0000 UTC m=+130.665094287" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.259608 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wgnr" podStartSLOduration=75.259580069 podStartE2EDuration="1m15.259580069s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.257710284 +0000 UTC m=+130.717137719" watchObservedRunningTime="2026-03-18 13:04:42.259580069 +0000 UTC m=+130.719007504" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.267713 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.269304 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.769286443 +0000 UTC m=+131.228713868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.292608 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bbvtw" event={"ID":"7feb8268-723e-408b-b800-744481779d38","Type":"ContainerStarted","Data":"b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.292660 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" event={"ID":"65ef2d7f-a45a-4787-a4e6-441dee567ed0","Type":"ContainerStarted","Data":"a8bdb5beff20bda8abb97f82781699e8aa5acbb5c5b431d87bdcb309ae212689"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.294831 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-bbvtw" podStartSLOduration=75.294811717 podStartE2EDuration="1m15.294811717s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.291631461 +0000 UTC m=+130.751058886" watchObservedRunningTime="2026-03-18 13:04:42.294811717 +0000 UTC m=+130.754239142" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.305367 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mncph" 
event={"ID":"7b2647fd-d28d-4dff-a4ef-e7839fffd33e","Type":"ContainerStarted","Data":"2dbb04f3ae19c3d929e8580642ae1b2f6e9cad0cd80c769e7e9efce37009bcd6"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.327415 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" event={"ID":"9c71dac9-3812-4931-a1ef-0f0796ed93c9","Type":"ContainerStarted","Data":"91e1096578d24fd8ee0ad3c0a70eb138dd7e2863713af1d09e0c20b85acee182"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.335272 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" event={"ID":"a729c6f8-e561-4f43-8fb9-48834e1873f2","Type":"ContainerStarted","Data":"3ff9414f8446bd9d490d7e307f09b23212ebb5df1fb8b98e9c0f8eaed8f9dbe8"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.337726 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.346001 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.360688 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" podStartSLOduration=75.360667441 podStartE2EDuration="1m15.360667441s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.359586476 +0000 UTC m=+130.819013911" watchObservedRunningTime="2026-03-18 13:04:42.360667441 +0000 UTC m=+130.820094856" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.369624 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.371722 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.871706757 +0000 UTC m=+131.331134182 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.376236 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.394501 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.409341 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" event={"ID":"2cb39839-5023-4811-8fc9-0432601dc0d8","Type":"ContainerStarted","Data":"c2f5982942223e89315e69957c181268fa7746ae935f1c41ae757ef75352e97a"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.410885 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.422282 4912 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pdkfk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.422354 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" podUID="2cb39839-5023-4811-8fc9-0432601dc0d8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.444148 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ksmgs" event={"ID":"f8606cf1-08bc-4fb4-b86d-ef4ab4f3f973","Type":"ContainerStarted","Data":"c0c32d8a35b32de9a36400896921383b53620061497f8d7afc93ac438a179a9d"} Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.444195 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ksmgs" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.446364 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.446851 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nvlj8"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.454547 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Mar 18 13:04:42 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:42 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:42 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.454628 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.465901 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sn298"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.471339 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.474932 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:42.97490826 +0000 UTC m=+131.434335685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.478625 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj"] Mar 18 13:04:42 crc kubenswrapper[4912]: W0318 13:04:42.487175 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f2f03ae_9287_4840_bcda_91d0b68849d7.slice/crio-7b57321960abe69869cf0a1e84a2d497b09c29c79bb2fda2f27d47c8228de281 WatchSource:0}: Error finding container 7b57321960abe69869cf0a1e84a2d497b09c29c79bb2fda2f27d47c8228de281: Status 404 returned error can't find the container with id 7b57321960abe69869cf0a1e84a2d497b09c29c79bb2fda2f27d47c8228de281 Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.487981 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2ghlz"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.497614 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.507403 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2"] Mar 18 13:04:42 crc kubenswrapper[4912]: W0318 13:04:42.524235 4912 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c489444_b3c7_4ec6_a959_6dcdb2b83660.slice/crio-13fe4e57f53607880f5e1d5786e3df6dbad392635d4decea418752b96e61a898 WatchSource:0}: Error finding container 13fe4e57f53607880f5e1d5786e3df6dbad392635d4decea418752b96e61a898: Status 404 returned error can't find the container with id 13fe4e57f53607880f5e1d5786e3df6dbad392635d4decea418752b96e61a898 Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.532499 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4r8tg"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.544269 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kfvgw" podStartSLOduration=75.544245728 podStartE2EDuration="1m15.544245728s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.513945089 +0000 UTC m=+130.973372524" watchObservedRunningTime="2026-03-18 13:04:42.544245728 +0000 UTC m=+131.003673153" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.567556 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.573058 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.581116 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-q8fqp"] Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.583773 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.076064894 +0000 UTC m=+131.535492319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.622647 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.628245 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jsbwx"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.642520 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tjwtn"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.648527 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.658726 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.664671 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.667854 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t75hw"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.669573 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gh7ht"] Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.674192 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.674269 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mncph" podStartSLOduration=25.674202115 podStartE2EDuration="25.674202115s" podCreationTimestamp="2026-03-18 13:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.666472489 +0000 UTC m=+131.125899914" watchObservedRunningTime="2026-03-18 13:04:42.674202115 +0000 UTC m=+131.133629540" Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.674743 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.174720588 +0000 UTC m=+131.634148013 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.748204 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" podStartSLOduration=75.748172965 podStartE2EDuration="1m15.748172965s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.73051687 +0000 UTC m=+131.189944305" watchObservedRunningTime="2026-03-18 13:04:42.748172965 +0000 UTC m=+131.207600390" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.762845 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ksmgs" podStartSLOduration=26.762812397 podStartE2EDuration="26.762812397s" podCreationTimestamp="2026-03-18 13:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:42.75129538 +0000 UTC m=+131.210722825" watchObservedRunningTime="2026-03-18 13:04:42.762812397 +0000 UTC m=+131.222239822" Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.775499 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.776142 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.276118707 +0000 UTC m=+131.735546132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.878690 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.879213 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.379174007 +0000 UTC m=+131.838601432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.879277 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.880185 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.380176571 +0000 UTC m=+131.839603996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.981480 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.981749 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.481679703 +0000 UTC m=+131.941107128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:42 crc kubenswrapper[4912]: I0318 13:04:42.982157 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:42 crc kubenswrapper[4912]: E0318 13:04:42.982526 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.482511123 +0000 UTC m=+131.941938548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.083530 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.083715 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.583679537 +0000 UTC m=+132.043106962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.083894 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.084350 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.584334583 +0000 UTC m=+132.043762008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.177692 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35734: no serving certificate available for the kubelet" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.184680 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.185253 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.685233151 +0000 UTC m=+132.144660576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.276649 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35750: no serving certificate available for the kubelet" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.286443 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.286871 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.786851146 +0000 UTC m=+132.246278571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.377029 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35766: no serving certificate available for the kubelet" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.392971 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.393484 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.893461799 +0000 UTC m=+132.352889224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.455944 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:43 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:43 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:43 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.456572 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.482641 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" event={"ID":"a729c6f8-e561-4f43-8fb9-48834e1873f2","Type":"ContainerStarted","Data":"17361c280cfbbe6723167d8c5516cc50a2b0e46ad345c042e67760ad01537bb8"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.497233 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: 
\"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.498090 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:43.998066126 +0000 UTC m=+132.457493541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.504995 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35780: no serving certificate available for the kubelet" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.513301 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" event={"ID":"e1395fca-0450-4427-98ab-c41857892b0a","Type":"ContainerStarted","Data":"7a433c0275eaa4b441652febc13e030dfbb2b1dd9c4481c11088ba26149284e3"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.513399 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" event={"ID":"e1395fca-0450-4427-98ab-c41857892b0a","Type":"ContainerStarted","Data":"941918c09fc9f6d498d8087e37c8072fe2e76eea1f01b05b859460303f001c2e"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.523326 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" 
event={"ID":"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3","Type":"ContainerStarted","Data":"16ea1c6cca0be729e54d9bf4afae266426c99871d8340ed1be7ec0d30d2454aa"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.566071 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" event={"ID":"3abcfc85-e792-4ba8-a6c2-db7130b1f423","Type":"ContainerStarted","Data":"5bf4e769377eb92386b451feb9df3582e631d3e37c9333fb2df7bfdd006e15d2"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.579352 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" event={"ID":"5f2f03ae-9287-4840-bcda-91d0b68849d7","Type":"ContainerStarted","Data":"34a667ec563a78647ab55a5bd9fe809cb2f9b41308ed63dd5a3b6f11d0974bee"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.579440 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" event={"ID":"5f2f03ae-9287-4840-bcda-91d0b68849d7","Type":"ContainerStarted","Data":"7b57321960abe69869cf0a1e84a2d497b09c29c79bb2fda2f27d47c8228de281"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.581096 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" event={"ID":"65b97390-afd1-41da-9b38-f3467a213007","Type":"ContainerStarted","Data":"41282a9362f819be4f2434ed2c1e53e2535e8b60b870a607f3007f4832c8a4d5"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.586955 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" event={"ID":"7c330cee-6841-4810-8701-53c782ee170b","Type":"ContainerStarted","Data":"715d7185e4a299596c329760f07abd5e15c54754dd88f4e0383862cc8fcbca1a"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.587008 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" event={"ID":"7c330cee-6841-4810-8701-53c782ee170b","Type":"ContainerStarted","Data":"d2ed3c1dc71e923f32a02131d60a40574a7833937b9122bec3010f49e648adf7"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.599778 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" event={"ID":"65ef2d7f-a45a-4787-a4e6-441dee567ed0","Type":"ContainerStarted","Data":"5ec9438ea8143f6b02c1878e38e9c99d56c50cdde232aa367179e5a044ecb24b"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.600712 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.601764 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.101723608 +0000 UTC m=+132.561151033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.610009 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" event={"ID":"389bca57-3d65-4ed4-8b0d-9c09c58ecf99","Type":"ContainerStarted","Data":"9ca52ef970a8b89cd53174e3362454201e044f89267e4da1ee360b295cf4b197"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.610503 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" event={"ID":"389bca57-3d65-4ed4-8b0d-9c09c58ecf99","Type":"ContainerStarted","Data":"363cad3dc040872fe91fca4b649b25657103a290d82f4cbb6cdfc13db35420e3"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.610874 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.618004 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.618736 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35796: no serving certificate available for the kubelet" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.618697 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.620226 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" event={"ID":"542111b4-ff3a-41fc-a963-0b55c1ace3e9","Type":"ContainerStarted","Data":"753c5a43fece1f28230d3efb6b34d8677308438940752fdff30577ab5bea753c"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.620275 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" event={"ID":"542111b4-ff3a-41fc-a963-0b55c1ace3e9","Type":"ContainerStarted","Data":"a0685be36ee8aca9ce881f697e3a810a22708826f74f355b31b50884e2dea1db"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.640906 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" event={"ID":"39a2121c-5ff0-4ff6-84de-f1354552a568","Type":"ContainerStarted","Data":"186b504d07240304a699ec00f0ead9f63d4c6350fbcc556dccc83cfb2fdc6d12"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.642131 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vm9dz" podStartSLOduration=75.642107287 podStartE2EDuration="1m15.642107287s" podCreationTimestamp="2026-03-18 13:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.513059514 +0000 UTC m=+131.972486949" watchObservedRunningTime="2026-03-18 13:04:43.642107287 +0000 UTC m=+132.101534712" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.648598 4912 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" event={"ID":"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1","Type":"ContainerStarted","Data":"402c4f98f666a56bfa0c481ea4f7262c813a6e5d2ed9c27f303b5d97e9b03be1"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.657640 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" event={"ID":"ff22e507-73a7-44b1-9eab-c704fb998092","Type":"ContainerStarted","Data":"10a7a8ad9c1fdeb74a65ebd3d4c130f78dbac1564c4d823772a48f356bcb9af8"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.662578 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" event={"ID":"b210beca-0aed-404b-9af5-b704345ce2f8","Type":"ContainerStarted","Data":"8cf396c5be0f1ad9c85fd1fa1fea67fb349958f090004310aea77ada8d12a99c"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.666537 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" event={"ID":"d5f844c0-ca4c-4097-bedd-bbb4323cc717","Type":"ContainerStarted","Data":"fea6cdf4fb3627a067c4cd496d9aa42111112ee10440ea9b47d6c59848717143"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.679782 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" event={"ID":"6c489444-b3c7-4ec6-a959-6dcdb2b83660","Type":"ContainerStarted","Data":"13fe4e57f53607880f5e1d5786e3df6dbad392635d4decea418752b96e61a898"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.681732 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" event={"ID":"5037377a-5754-40b3-8ffc-ef8776d54442","Type":"ContainerStarted","Data":"8fe503fc57987d5f2b2532bf92242a216b46cd4e348707017d0953d68a8dc948"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.681772 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" event={"ID":"5037377a-5754-40b3-8ffc-ef8776d54442","Type":"ContainerStarted","Data":"a12f8487e5a885edcf7e2c09023ca878d98c547899f736a731c3da545214ebc4"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.683020 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.684635 4912 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zzbbb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.684680 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" podUID="5037377a-5754-40b3-8ffc-ef8776d54442" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.695338 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hdfkl" podStartSLOduration=76.695318736 podStartE2EDuration="1m16.695318736s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.640225006 +0000 UTC m=+132.099652431" watchObservedRunningTime="2026-03-18 13:04:43.695318736 +0000 UTC m=+132.154746161" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.703325 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.704944 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.204930577 +0000 UTC m=+132.664358002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.707696 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podStartSLOduration=76.707664622 podStartE2EDuration="1m16.707664622s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.694954886 +0000 UTC m=+132.154382311" watchObservedRunningTime="2026-03-18 13:04:43.707664622 +0000 UTC m=+132.167092047" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.710710 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" 
event={"ID":"65fd5ca1-a95e-47d1-9e3c-62178a36eab9","Type":"ContainerStarted","Data":"af321314bc9b7399220dbd5bd3c100fa62d1b7b1c87f7240dd256ce4574ff0ad"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.710756 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.710768 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" event={"ID":"65fd5ca1-a95e-47d1-9e3c-62178a36eab9","Type":"ContainerStarted","Data":"4d5ce656d4bc0a58b02318a4eaf973917d95a19cf3b029019825c0999bfb7bf3"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.714732 4912 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9c657 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.715774 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" podUID="65fd5ca1-a95e-47d1-9e3c-62178a36eab9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.726603 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" event={"ID":"61f97d4c-a7a2-4d3c-bb11-a397c93efbad","Type":"ContainerStarted","Data":"01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.726659 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" event={"ID":"61f97d4c-a7a2-4d3c-bb11-a397c93efbad","Type":"ContainerStarted","Data":"c995453ba382d1cd0de58d55c50f7379ebf9260c3486f05c089ed59311582f94"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.727707 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.729032 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" podStartSLOduration=76.729014363 podStartE2EDuration="1m16.729014363s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.727562213 +0000 UTC m=+132.186989668" watchObservedRunningTime="2026-03-18 13:04:43.729014363 +0000 UTC m=+132.188441788" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.730026 4912 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jsbwx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.730094 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.735556 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" 
event={"ID":"718af076-f027-4594-8294-53ec36b84f3c","Type":"ContainerStarted","Data":"b9652941f65b158fd0ba0242c4dc87bb28a99ed545fbc984d01319fb5e050100"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.735635 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" event={"ID":"718af076-f027-4594-8294-53ec36b84f3c","Type":"ContainerStarted","Data":"1c029cabe0244038ee5aed74d26d2025f1e7e81229424accc697a8c3502f970f"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.737095 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.750084 4912 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.750148 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.767203 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" event={"ID":"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a","Type":"ContainerStarted","Data":"1adab9420e634f331866adbbd4ba758528f3a4d051941d2cf4a25a2e71414b12"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.767260 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" event={"ID":"3659ebb3-2a51-4ce4-8e7f-b83c91e4bf3a","Type":"ContainerStarted","Data":"b72814f976d30e3d40d75d2034f185986931c5064021f6ec763a34a3b035c1ed"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.779335 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" event={"ID":"2cb39839-5023-4811-8fc9-0432601dc0d8","Type":"ContainerStarted","Data":"1950cf5aba35aaa13bd0c44e93c0531aa026cefc481fb23198a43ab4cf7f15b8"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.783289 4912 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pdkfk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.783372 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" podUID="2cb39839-5023-4811-8fc9-0432601dc0d8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.796604 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" event={"ID":"49b7b174-b61f-4835-abf2-f90c11167250","Type":"ContainerStarted","Data":"57d5bd55d9c539c26dcc8c6e5b7d8737c569ee78730e51597a220ca6cb87f59f"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.797053 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" 
event={"ID":"49b7b174-b61f-4835-abf2-f90c11167250","Type":"ContainerStarted","Data":"5fd604f6be017195e2a570b617ce1b7894853e0e207619377c44dddaf5f9a735"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.805620 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.807491 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.307465808 +0000 UTC m=+132.766893233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.818610 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpn9z" event={"ID":"39c7b2b0-6f20-426b-961d-65878696145f","Type":"ContainerStarted","Data":"f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.818673 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpn9z" 
event={"ID":"39c7b2b0-6f20-426b-961d-65878696145f","Type":"ContainerStarted","Data":"bbdb52150e48248d30a986b090e5eb1b861fb63ca9ce5cc38829b9e1b40b34dd"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.834442 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mncph" event={"ID":"7b2647fd-d28d-4dff-a4ef-e7839fffd33e","Type":"ContainerStarted","Data":"19e63f6ce68884ee741b4b64a1951236d8512dd4917ccb2de8d09ba07dc5b280"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.862663 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2ghlz" event={"ID":"1b34ff88-74eb-45ce-acd4-3b7b272e1747","Type":"ContainerStarted","Data":"5132e6093b44eaf591cebc1965c04878d115a1f5e8d63e0aab8fd470ac942f02"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.862736 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2ghlz" event={"ID":"1b34ff88-74eb-45ce-acd4-3b7b272e1747","Type":"ContainerStarted","Data":"34235665b227b6f8e6e43dcf7746f905f9feffe477556c36831042fad13f62ac"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.865362 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" podStartSLOduration=75.865344274 podStartE2EDuration="1m15.865344274s" podCreationTimestamp="2026-03-18 13:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.829402106 +0000 UTC m=+132.288829541" watchObservedRunningTime="2026-03-18 13:04:43.865344274 +0000 UTC m=+132.324771709" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.867677 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.868176 4912 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.868383 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.869002 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" podStartSLOduration=76.868980763 podStartE2EDuration="1m16.868980763s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.864569313 +0000 UTC m=+132.323996758" watchObservedRunningTime="2026-03-18 13:04:43.868980763 +0000 UTC m=+132.328408188" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.878221 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35808: no serving certificate available for the kubelet" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.879889 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" event={"ID":"1fc33f6a-4371-412a-9e69-52c81be07685","Type":"ContainerStarted","Data":"bb7dcbca29cfc08483c496f1b85e218c40c02f250b3ea42b60961d3c2e4a387d"} Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.881809 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" 
containerName="kube-multus-additional-cni-plugins" containerID="cri-o://be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" gracePeriod=30 Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.894687 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnzgj" podStartSLOduration=76.894666942 podStartE2EDuration="1m16.894666942s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.892687378 +0000 UTC m=+132.352114813" watchObservedRunningTime="2026-03-18 13:04:43.894666942 +0000 UTC m=+132.354094367" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.908667 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:43 crc kubenswrapper[4912]: E0318 13:04:43.910696 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.410672498 +0000 UTC m=+132.870099923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.942612 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podStartSLOduration=76.942591007 podStartE2EDuration="1m16.942591007s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.939810051 +0000 UTC m=+132.399237476" watchObservedRunningTime="2026-03-18 13:04:43.942591007 +0000 UTC m=+132.402018432" Mar 18 13:04:43 crc kubenswrapper[4912]: I0318 13:04:43.996874 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2ghlz" podStartSLOduration=76.996841534 podStartE2EDuration="1m16.996841534s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:43.992252429 +0000 UTC m=+132.451679874" watchObservedRunningTime="2026-03-18 13:04:43.996841534 +0000 UTC m=+132.456268969" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.010196 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.010628 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.510603397 +0000 UTC m=+132.970030822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.012993 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.015650 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.515617224 +0000 UTC m=+132.975044829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.036788 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vpn9z" podStartSLOduration=77.036760379 podStartE2EDuration="1m17.036760379s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:44.033267454 +0000 UTC m=+132.492694899" watchObservedRunningTime="2026-03-18 13:04:44.036760379 +0000 UTC m=+132.496187804" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.091874 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xpph2" podStartSLOduration=77.091854169 podStartE2EDuration="1m17.091854169s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:44.090534783 +0000 UTC m=+132.549962218" watchObservedRunningTime="2026-03-18 13:04:44.091854169 +0000 UTC m=+132.551281594" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.116819 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.118135 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.618104283 +0000 UTC m=+133.077531708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.152441 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35814: no serving certificate available for the kubelet" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.218861 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.219297 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.719282878 +0000 UTC m=+133.178710303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.319929 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.320289 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.820242886 +0000 UTC m=+133.279670311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.320392 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.320795 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.820778681 +0000 UTC m=+133.280206116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.425266 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.425678 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.925647955 +0000 UTC m=+133.385075370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.426105 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.426684 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:44.926665183 +0000 UTC m=+133.386092608 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.442523 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:44 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:44 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:44 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.442594 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.527182 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.527441 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:45.027400855 +0000 UTC m=+133.486828280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.527544 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.527853 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.027837927 +0000 UTC m=+133.487265352 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.547422 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35830: no serving certificate available for the kubelet" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.628646 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.628925 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.128884458 +0000 UTC m=+133.588311893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.628998 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.629466 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.129454913 +0000 UTC m=+133.588882338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.730585 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.730874 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.230825253 +0000 UTC m=+133.690252698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.731230 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.731714 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.231699077 +0000 UTC m=+133.691126502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.832621 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.832826 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.332789308 +0000 UTC m=+133.792216733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.833057 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.833460 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.333443966 +0000 UTC m=+133.792871391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.887817 4912 generic.go:334] "Generic (PLEG): container finished" podID="b210beca-0aed-404b-9af5-b704345ce2f8" containerID="7a411dad9b97f58de8fcb8612fa3844912948c819c79a2890fc99e9b52f5ebf7" exitCode=0 Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.887873 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" event={"ID":"b210beca-0aed-404b-9af5-b704345ce2f8","Type":"ContainerDied","Data":"7a411dad9b97f58de8fcb8612fa3844912948c819c79a2890fc99e9b52f5ebf7"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.892104 4912 generic.go:334] "Generic (PLEG): container finished" podID="39a2121c-5ff0-4ff6-84de-f1354552a568" containerID="df62798d064758317b237bd20939c02a7a1cce19c79da1f3ab31cca5a2651566" exitCode=0 Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.892823 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" event={"ID":"39a2121c-5ff0-4ff6-84de-f1354552a568","Type":"ContainerDied","Data":"df62798d064758317b237bd20939c02a7a1cce19c79da1f3ab31cca5a2651566"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.897691 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" event={"ID":"e1395fca-0450-4427-98ab-c41857892b0a","Type":"ContainerStarted","Data":"93e671b01e98572967b7c1b24c645ac794690ee73899c06b6c42f134ffe2d528"} Mar 18 13:04:44 crc 
kubenswrapper[4912]: I0318 13:04:44.902249 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" event={"ID":"1fc33f6a-4371-412a-9e69-52c81be07685","Type":"ContainerStarted","Data":"c7a443a4e593358a321de29a6f96caca2c0c397ea8c4b9ad564ca039c5d0f66e"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.907521 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" event={"ID":"65b97390-afd1-41da-9b38-f3467a213007","Type":"ContainerStarted","Data":"07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.907766 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.909336 4912 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-q8fqp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.909399 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" podUID="65b97390-afd1-41da-9b38-f3467a213007" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.911202 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" event={"ID":"542111b4-ff3a-41fc-a963-0b55c1ace3e9","Type":"ContainerStarted","Data":"6f356741f0a9da028c7d643afd3630b7cacba634096fe9f31f89cfd48b9b6609"} Mar 18 13:04:44 crc 
kubenswrapper[4912]: I0318 13:04:44.918236 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" event={"ID":"7c330cee-6841-4810-8701-53c782ee170b","Type":"ContainerStarted","Data":"e9084a9414985425c25e48b9da29cf08c293081128f5df532750a86b95bc126b"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.920541 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" event={"ID":"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3","Type":"ContainerStarted","Data":"71691bab47dd1299c204f36593b2e87a78feafb957d24ebcddb7db7fb06086b0"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.920799 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.922697 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.922760 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.930675 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" event={"ID":"6c489444-b3c7-4ec6-a959-6dcdb2b83660","Type":"ContainerStarted","Data":"537ec74613b2ec182471a9c5a63a5f515d81c202d42bbb7233b252b1872213fd"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 
13:04:44.930742 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" event={"ID":"6c489444-b3c7-4ec6-a959-6dcdb2b83660","Type":"ContainerStarted","Data":"32e9a733334ee4e1c09dc5e1ecf923b83e77d69842e584f252e316b2370b78d2"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.936600 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" event={"ID":"3abcfc85-e792-4ba8-a6c2-db7130b1f423","Type":"ContainerStarted","Data":"2b940b50838cd8809178bdfaff64a279538d4d859a4d77a6fbc2de592058e2a1"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.937608 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:44 crc kubenswrapper[4912]: E0318 13:04:44.938136 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.438108525 +0000 UTC m=+133.897535950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.950658 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" event={"ID":"5f2f03ae-9287-4840-bcda-91d0b68849d7","Type":"ContainerStarted","Data":"192e1c96eff03e0f62710194722b73674fcb0688acd78aa035037f1792a605b1"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.950787 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.962339 4912 generic.go:334] "Generic (PLEG): container finished" podID="ff22e507-73a7-44b1-9eab-c704fb998092" containerID="282fe885a1d5cf5e42854153a46ccfae32b88ad64803bc952a018193fc87132b" exitCode=0 Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.962674 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" event={"ID":"ff22e507-73a7-44b1-9eab-c704fb998092","Type":"ContainerDied","Data":"282fe885a1d5cf5e42854153a46ccfae32b88ad64803bc952a018193fc87132b"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.983917 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" event={"ID":"6508b29f-a1b9-4a3a-aa9d-312c53ed90b1","Type":"ContainerStarted","Data":"624b458c2e5dfb08faaacbcfcd21f90fe576c6737a1681fccd7e882f45cc87ac"} Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.985084 4912 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jsbwx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.985144 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.985722 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.985778 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.993805 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 13:04:44 crc kubenswrapper[4912]: I0318 13:04:44.997075 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.005089 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" podStartSLOduration=78.005060748 podStartE2EDuration="1m18.005060748s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.00292509 +0000 UTC m=+133.462352545" watchObservedRunningTime="2026-03-18 13:04:45.005060748 +0000 UTC m=+133.464488173" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.016028 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.039404 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.047918 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.547895704 +0000 UTC m=+134.007323129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.061753 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.122063 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f7ql4" podStartSLOduration=78.122027812 podStartE2EDuration="1m18.122027812s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.115392741 +0000 UTC m=+133.574820176" watchObservedRunningTime="2026-03-18 13:04:45.122027812 +0000 UTC m=+133.581455247" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.135152 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-24gsx" podStartSLOduration=78.135129409 podStartE2EDuration="1m18.135129409s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.075194637 +0000 UTC m=+133.534622072" watchObservedRunningTime="2026-03-18 13:04:45.135129409 +0000 UTC m=+133.594556834" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.142031 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.144807 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.644779081 +0000 UTC m=+134.104206506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.246635 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nvlj8" podStartSLOduration=78.246599093 podStartE2EDuration="1m18.246599093s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.234594086 +0000 UTC m=+133.694021551" watchObservedRunningTime="2026-03-18 13:04:45.246599093 +0000 UTC m=+133.706026518" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.248647 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.249172 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.749144622 +0000 UTC m=+134.208572047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.303393 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podStartSLOduration=78.303356368 podStartE2EDuration="1m18.303356368s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.292522153 +0000 UTC m=+133.751949598" watchObservedRunningTime="2026-03-18 13:04:45.303356368 +0000 UTC m=+133.762783813" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.330590 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35846: no serving certificate available for the kubelet" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.354793 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.355298 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.855278181 +0000 UTC m=+134.314705606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.376784 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sn298" podStartSLOduration=78.376762806 podStartE2EDuration="1m18.376762806s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.355644261 +0000 UTC m=+133.815071686" watchObservedRunningTime="2026-03-18 13:04:45.376762806 +0000 UTC m=+133.836190231" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.430649 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vx6l" podStartSLOduration=78.430620062 podStartE2EDuration="1m18.430620062s" 
podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.428086003 +0000 UTC m=+133.887513448" watchObservedRunningTime="2026-03-18 13:04:45.430620062 +0000 UTC m=+133.890047487" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.454751 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:45 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:45 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:45 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.454833 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.456482 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.456892 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:45.956875817 +0000 UTC m=+134.416303242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.559062 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.559463 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.059442009 +0000 UTC m=+134.518869434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.661901 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.662330 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.1623174 +0000 UTC m=+134.621744825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.697161 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podStartSLOduration=78.697138207 podStartE2EDuration="1m18.697138207s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.640502316 +0000 UTC m=+134.099929741" watchObservedRunningTime="2026-03-18 13:04:45.697138207 +0000 UTC m=+134.156565632" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.759769 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gh7ht" podStartSLOduration=77.759745682 podStartE2EDuration="1m17.759745682s" podCreationTimestamp="2026-03-18 13:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:45.757508681 +0000 UTC m=+134.216936106" watchObservedRunningTime="2026-03-18 13:04:45.759745682 +0000 UTC m=+134.219173107" Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.762920 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.763274 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.263260007 +0000 UTC m=+134.722687432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.865302 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.865937 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.365918232 +0000 UTC m=+134.825345657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.966846 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:45 crc kubenswrapper[4912]: E0318 13:04:45.967351 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.467327972 +0000 UTC m=+134.926755397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.992966 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:04:45 crc kubenswrapper[4912]: I0318 13:04:45.993077 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.018077 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" event={"ID":"39a2121c-5ff0-4ff6-84de-f1354552a568","Type":"ContainerStarted","Data":"902b4801a45f1a5e8d69fec2609a83beed5088682cc7e10454901374cdde32aa"} Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.047171 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" event={"ID":"ff22e507-73a7-44b1-9eab-c704fb998092","Type":"ContainerStarted","Data":"1b4e5218ac03bfbd62317e391933f9ef90f802746124cf9c50635e1b5575e9d4"} Mar 18 13:04:46 crc kubenswrapper[4912]: 
I0318 13:04:46.073549 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.075221 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.575202389 +0000 UTC m=+135.034629814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.101049 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" podStartSLOduration=79.101009931 podStartE2EDuration="1m19.101009931s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:46.093437745 +0000 UTC m=+134.552865180" watchObservedRunningTime="2026-03-18 13:04:46.101009931 +0000 UTC m=+134.560437356" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.127164 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" event={"ID":"b210beca-0aed-404b-9af5-b704345ce2f8","Type":"ContainerStarted","Data":"70ce4157388f50d78ac10e04e377efb6d37df965da999fee14c6b0b9bc0f0098"} Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.129701 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.129748 4912 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jsbwx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.129797 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.129804 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.129889 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k6ggq"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.131532 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.131611 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.142472 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.171066 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6ggq"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.177624 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.178079 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-catalog-content\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.178542 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9csh\" (UniqueName: \"kubernetes.io/projected/aa54da71-3eed-40ca-a608-43d7f9273e80-kube-api-access-m9csh\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.178576 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-utilities\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.179467 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.679431846 +0000 UTC m=+135.138859271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.285709 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-catalog-content\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.285778 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.285844 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9csh\" (UniqueName: \"kubernetes.io/projected/aa54da71-3eed-40ca-a608-43d7f9273e80-kube-api-access-m9csh\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.285873 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-utilities\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.286435 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-utilities\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.288818 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.788800553 +0000 UTC m=+135.248227978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.298858 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-catalog-content\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.300051 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podStartSLOduration=79.299998388 podStartE2EDuration="1m19.299998388s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:46.223793564 +0000 UTC m=+134.683221009" watchObservedRunningTime="2026-03-18 13:04:46.299998388 +0000 UTC m=+134.759425813" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.331730 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r9dkt"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.333112 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.333768 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.353945 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.388134 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.388447 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.888404265 +0000 UTC m=+135.347831690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.388542 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-utilities\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.388624 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.388980 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-catalog-content\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.389158 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7gs\" (UniqueName: 
\"kubernetes.io/projected/797e0d01-0e3c-498f-abe9-5c90c0e53215-kube-api-access-mn7gs\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.389703 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.8896908 +0000 UTC m=+135.349118235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.394778 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9dkt"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.404899 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9csh\" (UniqueName: \"kubernetes.io/projected/aa54da71-3eed-40ca-a608-43d7f9273e80-kube-api-access-m9csh\") pod \"certified-operators-k6ggq\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.430748 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.431838 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.434646 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.444100 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.450155 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:46 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:46 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:46 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.450214 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.468181 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4tf6g"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.469699 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.470396 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.471265 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.490901 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4tf6g"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.491818 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.492108 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.492159 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7gs\" (UniqueName: \"kubernetes.io/projected/797e0d01-0e3c-498f-abe9-5c90c0e53215-kube-api-access-mn7gs\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.492206 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:46.992177819 +0000 UTC m=+135.451605244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.492260 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-utilities\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.492333 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.492479 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.492585 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-catalog-content\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.493012 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:46.992992112 +0000 UTC m=+135.452419537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.496587 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-catalog-content\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.499668 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-utilities\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.551797 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7gs\" (UniqueName: 
\"kubernetes.io/projected/797e0d01-0e3c-498f-abe9-5c90c0e53215-kube-api-access-mn7gs\") pod \"community-operators-r9dkt\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.598970 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.599433 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.599499 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-utilities\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.599537 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-catalog-content\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.599640 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.599684 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6tf9\" (UniqueName: \"kubernetes.io/projected/ba477a7f-ee05-44cf-be52-4c67c7a50192-kube-api-access-t6tf9\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.599831 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.099804419 +0000 UTC m=+135.559231844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.600309 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.651268 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gl7cm"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.652514 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.663001 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.672480 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.701686 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-utilities\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.701742 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-utilities\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.701770 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-catalog-content\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.701804 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-catalog-content\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.701833 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.701920 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6tf9\" (UniqueName: \"kubernetes.io/projected/ba477a7f-ee05-44cf-be52-4c67c7a50192-kube-api-access-t6tf9\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.701955 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7rtr\" (UniqueName: \"kubernetes.io/projected/3af91020-4095-48f0-9457-b171de576fe0-kube-api-access-q7rtr\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.702819 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-utilities\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.703173 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-catalog-content\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.704202 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:47.20417393 +0000 UTC m=+135.663601525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.715599 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gl7cm"] Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.731182 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.765487 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6tf9\" (UniqueName: \"kubernetes.io/projected/ba477a7f-ee05-44cf-be52-4c67c7a50192-kube-api-access-t6tf9\") pod \"certified-operators-4tf6g\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.765970 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35858: no serving certificate available for the kubelet" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.776246 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.815137 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.815504 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7rtr\" (UniqueName: \"kubernetes.io/projected/3af91020-4095-48f0-9457-b171de576fe0-kube-api-access-q7rtr\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.815551 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-utilities\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.815578 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-catalog-content\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.816274 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-catalog-content\") pod \"community-operators-gl7cm\" (UID: 
\"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.816426 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.316399125 +0000 UTC m=+135.775826550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.817251 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-utilities\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.823242 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.874173 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.908305 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7rtr\" (UniqueName: \"kubernetes.io/projected/3af91020-4095-48f0-9457-b171de576fe0-kube-api-access-q7rtr\") pod \"community-operators-gl7cm\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:46 crc kubenswrapper[4912]: I0318 13:04:46.919183 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:46 crc kubenswrapper[4912]: E0318 13:04:46.919620 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.419602235 +0000 UTC m=+135.879029660 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.012944 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.024295 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.024758 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.524736057 +0000 UTC m=+135.984163482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.132831 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.133719 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.633705123 +0000 UTC m=+136.093132548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.221355 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" event={"ID":"3abcfc85-e792-4ba8-a6c2-db7130b1f423","Type":"ContainerStarted","Data":"947ce3e9cc2d3071e62a7cc643e1e4a9b3777d103e6735a13942d3cd0f108914"} Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.234555 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.234934 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.734914758 +0000 UTC m=+136.194342193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.253272 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" event={"ID":"ff22e507-73a7-44b1-9eab-c704fb998092","Type":"ContainerStarted","Data":"22e0a1ae1817d505651fb8c1a97aa2c615a868c56c370c3a6226fa998270d934"} Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.362446 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.373232 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.873193272 +0000 UTC m=+136.332620697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.480870 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.481781 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:47.981754208 +0000 UTC m=+136.441181633 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.483380 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:47 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:47 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:47 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.483494 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.582871 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.583288 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 13:04:48.083261631 +0000 UTC m=+136.542689046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.612821 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" podStartSLOduration=80.612796664 podStartE2EDuration="1m20.612796664s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:47.311128763 +0000 UTC m=+135.770556188" watchObservedRunningTime="2026-03-18 13:04:47.612796664 +0000 UTC m=+136.072224089" Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.620196 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pdkfk"] Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.620488 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" podUID="2cb39839-5023-4811-8fc9-0432601dc0d8" containerName="controller-manager" containerID="cri-o://1950cf5aba35aaa13bd0c44e93c0531aa026cefc481fb23198a43ab4cf7f15b8" gracePeriod=30 Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.693107 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.693622 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.193603953 +0000 UTC m=+136.653031378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.705426 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"] Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.705754 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" podUID="65fd5ca1-a95e-47d1-9e3c-62178a36eab9" containerName="route-controller-manager" containerID="cri-o://af321314bc9b7399220dbd5bd3c100fa62d1b7b1c87f7240dd256ce4574ff0ad" gracePeriod=30 Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.797018 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: 
\"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.797190 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9dkt"] Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.797619 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.297600194 +0000 UTC m=+136.757027619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.858829 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.904239 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.904658 4912 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.404627198 +0000 UTC m=+136.864054623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.927406 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.945196 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6ggq"] Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.965732 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:04:47 crc kubenswrapper[4912]: E0318 13:04:47.965824 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" 
podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerName="kube-multus-additional-cni-plugins" Mar 18 13:04:47 crc kubenswrapper[4912]: I0318 13:04:47.984643 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4tf6g"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.015216 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:48 crc kubenswrapper[4912]: E0318 13:04:48.016569 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.516548675 +0000 UTC m=+136.975976100 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.070785 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-65p9d"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.071995 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.092880 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.117069 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:48 crc kubenswrapper[4912]: E0318 13:04:48.117276 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.617239336 +0000 UTC m=+137.076666761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.117603 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5sc\" (UniqueName: \"kubernetes.io/projected/5201881b-c2ba-46b7-aeae-62df63a255e8-kube-api-access-kp5sc\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.117734 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-catalog-content\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.117934 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.122104 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-utilities\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: E0318 13:04:48.122910 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.622888579 +0000 UTC m=+137.082316004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.131493 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65p9d"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.196509 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gl7cm"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.222070 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.223649 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 
13:04:48.223946 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-catalog-content\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.223997 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-utilities\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.224054 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5sc\" (UniqueName: \"kubernetes.io/projected/5201881b-c2ba-46b7-aeae-62df63a255e8-kube-api-access-kp5sc\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: E0318 13:04:48.224481 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.724465984 +0000 UTC m=+137.183893409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.224976 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-catalog-content\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.225393 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-utilities\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.252710 4912 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.275959 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5sc\" (UniqueName: \"kubernetes.io/projected/5201881b-c2ba-46b7-aeae-62df63a255e8-kube-api-access-kp5sc\") pod \"redhat-marketplace-65p9d\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.299113 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-r9dkt" event={"ID":"797e0d01-0e3c-498f-abe9-5c90c0e53215","Type":"ContainerStarted","Data":"c4e82dfe6ed5e99682af78ab2e76de920895f93dc15cac0c3c2f817f7748307c"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.306814 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.316859 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6ggq" event={"ID":"aa54da71-3eed-40ca-a608-43d7f9273e80","Type":"ContainerStarted","Data":"892c63f70ffdb84ca74f0ffbd1f49041e818f0194ff1762d06d47cb4d8190ed1"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.317100 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.321343 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.326861 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.327127 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.328652 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.328821 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.328925 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:48 crc kubenswrapper[4912]: E0318 13:04:48.329367 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 13:04:48.82935139 +0000 UTC m=+137.288778815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sq25z" (UID: "1e116845-2d89-48e3-b832-584be4553fd3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.352478 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl7cm" event={"ID":"3af91020-4095-48f0-9457-b171de576fe0","Type":"ContainerStarted","Data":"5e8b12566b2e75153fd64dea4c3174a65ba93eceddf2d2a56a1a893f5e1ab055"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.365554 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" event={"ID":"3abcfc85-e792-4ba8-a6c2-db7130b1f423","Type":"ContainerStarted","Data":"adc6b4d2bc4947b57f1212a2b2dced2c2db7b2f7b38826e3c4358c1912b11638"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.365616 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" event={"ID":"3abcfc85-e792-4ba8-a6c2-db7130b1f423","Type":"ContainerStarted","Data":"fb52deb8af0fb59338c7c647b67e1156880555a6a5c4bb807f66905a50f6d60e"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.390702 4912 generic.go:334] "Generic (PLEG): container finished" podID="65fd5ca1-a95e-47d1-9e3c-62178a36eab9" containerID="af321314bc9b7399220dbd5bd3c100fa62d1b7b1c87f7240dd256ce4574ff0ad" exitCode=0 Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.390843 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" 
event={"ID":"65fd5ca1-a95e-47d1-9e3c-62178a36eab9","Type":"ContainerDied","Data":"af321314bc9b7399220dbd5bd3c100fa62d1b7b1c87f7240dd256ce4574ff0ad"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.408998 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tf6g" event={"ID":"ba477a7f-ee05-44cf-be52-4c67c7a50192","Type":"ContainerStarted","Data":"f1524a86a7771dceff22d226fb2b514baf25051b456a05e49d115530876d5465"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.421779 4912 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T13:04:48.252757694Z","Handler":null,"Name":""} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.427330 4912 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.427368 4912 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.427448 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hj4gk"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.429017 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.431279 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.431653 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.431763 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.432430 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.449834 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:48 crc kubenswrapper[4912]: [-]has-synced 
failed: reason withheld Mar 18 13:04:48 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:48 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.449916 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.450019 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj4gk"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.451463 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.455728 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" podStartSLOduration=31.455680078 podStartE2EDuration="31.455680078s" podCreationTimestamp="2026-03-18 13:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:48.437930225 +0000 UTC m=+136.897357680" watchObservedRunningTime="2026-03-18 13:04:48.455680078 +0000 UTC m=+136.915107503" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.470851 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.472060 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.478931 4912 generic.go:334] "Generic (PLEG): container finished" podID="2cb39839-5023-4811-8fc9-0432601dc0d8" containerID="1950cf5aba35aaa13bd0c44e93c0531aa026cefc481fb23198a43ab4cf7f15b8" exitCode=0 Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.482606 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" event={"ID":"2cb39839-5023-4811-8fc9-0432601dc0d8","Type":"ContainerDied","Data":"1950cf5aba35aaa13bd0c44e93c0531aa026cefc481fb23198a43ab4cf7f15b8"} Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.498303 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.536150 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.536283 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-catalog-content\") 
pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.536355 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d58k4\" (UniqueName: \"kubernetes.io/projected/8c189f26-c791-48a6-a060-9982c8666243-kube-api-access-d58k4\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.536396 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-utilities\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.638586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-utilities\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.639079 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-catalog-content\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.639133 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d58k4\" (UniqueName: 
\"kubernetes.io/projected/8c189f26-c791-48a6-a060-9982c8666243-kube-api-access-d58k4\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.639366 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.640198 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-catalog-content\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.641177 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-utilities\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.654241 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.670341 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d58k4\" (UniqueName: \"kubernetes.io/projected/8c189f26-c791-48a6-a060-9982c8666243-kube-api-access-d58k4\") pod \"redhat-marketplace-hj4gk\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.699995 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.704511 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.704569 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.739934 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-serving-cert\") pod \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.740016 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-config\") pod \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.740093 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7t5k\" (UniqueName: \"kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k\") pod \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " Mar 18 13:04:48 crc 
kubenswrapper[4912]: I0318 13:04:48.740112 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-config\") pod \"2cb39839-5023-4811-8fc9-0432601dc0d8\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.740146 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-client-ca\") pod \"2cb39839-5023-4811-8fc9-0432601dc0d8\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.740162 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-client-ca\") pod \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\" (UID: \"65fd5ca1-a95e-47d1-9e3c-62178a36eab9\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.740522 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-proxy-ca-bundles\") pod \"2cb39839-5023-4811-8fc9-0432601dc0d8\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.740557 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb39839-5023-4811-8fc9-0432601dc0d8-serving-cert\") pod \"2cb39839-5023-4811-8fc9-0432601dc0d8\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.740636 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlgz9\" (UniqueName: \"kubernetes.io/projected/2cb39839-5023-4811-8fc9-0432601dc0d8-kube-api-access-zlgz9\") pod 
\"2cb39839-5023-4811-8fc9-0432601dc0d8\" (UID: \"2cb39839-5023-4811-8fc9-0432601dc0d8\") " Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.743058 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2cb39839-5023-4811-8fc9-0432601dc0d8" (UID: "2cb39839-5023-4811-8fc9-0432601dc0d8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.747898 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cb39839-5023-4811-8fc9-0432601dc0d8" (UID: "2cb39839-5023-4811-8fc9-0432601dc0d8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.748305 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-config" (OuterVolumeSpecName: "config") pod "2cb39839-5023-4811-8fc9-0432601dc0d8" (UID: "2cb39839-5023-4811-8fc9-0432601dc0d8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.748997 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.749021 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.749069 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cb39839-5023-4811-8fc9-0432601dc0d8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.749328 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-config" (OuterVolumeSpecName: "config") pod "65fd5ca1-a95e-47d1-9e3c-62178a36eab9" (UID: "65fd5ca1-a95e-47d1-9e3c-62178a36eab9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.751157 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-client-ca" (OuterVolumeSpecName: "client-ca") pod "65fd5ca1-a95e-47d1-9e3c-62178a36eab9" (UID: "65fd5ca1-a95e-47d1-9e3c-62178a36eab9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.752725 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k" (OuterVolumeSpecName: "kube-api-access-m7t5k") pod "65fd5ca1-a95e-47d1-9e3c-62178a36eab9" (UID: "65fd5ca1-a95e-47d1-9e3c-62178a36eab9"). InnerVolumeSpecName "kube-api-access-m7t5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.753512 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "65fd5ca1-a95e-47d1-9e3c-62178a36eab9" (UID: "65fd5ca1-a95e-47d1-9e3c-62178a36eab9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.757549 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb39839-5023-4811-8fc9-0432601dc0d8-kube-api-access-zlgz9" (OuterVolumeSpecName: "kube-api-access-zlgz9") pod "2cb39839-5023-4811-8fc9-0432601dc0d8" (UID: "2cb39839-5023-4811-8fc9-0432601dc0d8"). InnerVolumeSpecName "kube-api-access-zlgz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.758204 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb39839-5023-4811-8fc9-0432601dc0d8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cb39839-5023-4811-8fc9-0432601dc0d8" (UID: "2cb39839-5023-4811-8fc9-0432601dc0d8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.772408 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sq25z\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.776573 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.837529 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65p9d"] Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.850609 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7t5k\" (UniqueName: \"kubernetes.io/projected/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-kube-api-access-m7t5k\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.851105 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.851119 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb39839-5023-4811-8fc9-0432601dc0d8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.851132 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlgz9\" (UniqueName: \"kubernetes.io/projected/2cb39839-5023-4811-8fc9-0432601dc0d8-kube-api-access-zlgz9\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 
13:04:48.851147 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.851159 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65fd5ca1-a95e-47d1-9e3c-62178a36eab9-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.965376 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 13:04:48 crc kubenswrapper[4912]: I0318 13:04:48.974187 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.022215 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.116059 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj4gk"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.215018 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n2x6x"] Mar 18 13:04:49 crc kubenswrapper[4912]: E0318 13:04:49.215318 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb39839-5023-4811-8fc9-0432601dc0d8" containerName="controller-manager" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.215330 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb39839-5023-4811-8fc9-0432601dc0d8" containerName="controller-manager" Mar 18 13:04:49 crc kubenswrapper[4912]: E0318 13:04:49.215351 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fd5ca1-a95e-47d1-9e3c-62178a36eab9" containerName="route-controller-manager" Mar 18 13:04:49 crc 
kubenswrapper[4912]: I0318 13:04:49.215358 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fd5ca1-a95e-47d1-9e3c-62178a36eab9" containerName="route-controller-manager" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.215454 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fd5ca1-a95e-47d1-9e3c-62178a36eab9" containerName="route-controller-manager" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.215470 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb39839-5023-4811-8fc9-0432601dc0d8" containerName="controller-manager" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.216314 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.221848 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.229456 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6x"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.259517 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-catalog-content\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.259868 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvmvb\" (UniqueName: \"kubernetes.io/projected/46bfbc2d-eb99-4316-a9cc-be875edee92e-kube-api-access-rvmvb\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc 
kubenswrapper[4912]: I0318 13:04:49.259994 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-utilities\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.288169 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sq25z"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.361707 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvmvb\" (UniqueName: \"kubernetes.io/projected/46bfbc2d-eb99-4316-a9cc-be875edee92e-kube-api-access-rvmvb\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.361796 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-utilities\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.361857 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-catalog-content\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.362535 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-catalog-content\") pod 
\"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.362836 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-utilities\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.379634 4912 ???:1] "http: TLS handshake error from 192.168.126.11:35864: no serving certificate available for the kubelet" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.386780 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvmvb\" (UniqueName: \"kubernetes.io/projected/46bfbc2d-eb99-4316-a9cc-be875edee92e-kube-api-access-rvmvb\") pod \"redhat-operators-n2x6x\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.431297 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.450638 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:49 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:49 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:49 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.450721 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.506337 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65p9d" event={"ID":"5201881b-c2ba-46b7-aeae-62df63a255e8","Type":"ContainerStarted","Data":"315096c53f073fd8224bcb2f7aec68350379fe0f5a4deb1179da6b4cae22d818"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.515205 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15fbed0f-e4cb-400c-8391-a5985bcbd76d","Type":"ContainerStarted","Data":"9edcb7f96278d4573a3b44754677be9bdd466dc047685a4fc49ae383aa9b3f90"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.517027 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15fbed0f-e4cb-400c-8391-a5985bcbd76d","Type":"ContainerStarted","Data":"9847277f5a15eb09c405f97b10d0a7c6cd58d32b021ceb248268af88f602b28c"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.519266 4912 generic.go:334] "Generic (PLEG): 
container finished" podID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerID="e18cd9a9c97e2b3dc207b9ce9f8ad044b620f79fa50d25d7f7cc798b5b19194a" exitCode=0 Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.519308 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9dkt" event={"ID":"797e0d01-0e3c-498f-abe9-5c90c0e53215","Type":"ContainerDied","Data":"e18cd9a9c97e2b3dc207b9ce9f8ad044b620f79fa50d25d7f7cc798b5b19194a"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.522963 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.523643 4912 generic.go:334] "Generic (PLEG): container finished" podID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerID="66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338" exitCode=0 Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.524022 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6ggq" event={"ID":"aa54da71-3eed-40ca-a608-43d7f9273e80","Type":"ContainerDied","Data":"66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.525596 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" event={"ID":"1e116845-2d89-48e3-b832-584be4553fd3","Type":"ContainerStarted","Data":"5cd768257b5a38310177368d47ef1a10ad63fc3f1724fb425a036417d8187552"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.534475 4912 generic.go:334] "Generic (PLEG): container finished" podID="d5f844c0-ca4c-4097-bedd-bbb4323cc717" containerID="fea6cdf4fb3627a067c4cd496d9aa42111112ee10440ea9b47d6c59848717143" exitCode=0 Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.534595 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" 
event={"ID":"d5f844c0-ca4c-4097-bedd-bbb4323cc717","Type":"ContainerDied","Data":"fea6cdf4fb3627a067c4cd496d9aa42111112ee10440ea9b47d6c59848717143"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.541408 4912 generic.go:334] "Generic (PLEG): container finished" podID="3af91020-4095-48f0-9457-b171de576fe0" containerID="9d57efb31fecfd9de22088df406050103e36b7847eaffe89e0da353ee20f21d8" exitCode=0 Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.541584 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl7cm" event={"ID":"3af91020-4095-48f0-9457-b171de576fe0","Type":"ContainerDied","Data":"9d57efb31fecfd9de22088df406050103e36b7847eaffe89e0da353ee20f21d8"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.544719 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.544697833 podStartE2EDuration="3.544697833s" podCreationTimestamp="2026-03-18 13:04:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:49.542104833 +0000 UTC m=+138.001532278" watchObservedRunningTime="2026-03-18 13:04:49.544697833 +0000 UTC m=+138.004125258" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.598515 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" event={"ID":"65fd5ca1-a95e-47d1-9e3c-62178a36eab9","Type":"ContainerDied","Data":"4d5ce656d4bc0a58b02318a4eaf973917d95a19cf3b029019825c0999bfb7bf3"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.598603 4912 scope.go:117] "RemoveContainer" containerID="af321314bc9b7399220dbd5bd3c100fa62d1b7b1c87f7240dd256ce4574ff0ad" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.598830 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.608757 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj4gk" event={"ID":"8c189f26-c791-48a6-a060-9982c8666243","Type":"ContainerStarted","Data":"f98f62995c2fdc8fb22e64a3c977fb8148e9252a6634165d0e551d53c2577dc5"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.640607 4912 generic.go:334] "Generic (PLEG): container finished" podID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerID="40379c5a2d8a39c404ad21e27ef3c63e96a5a8d92f75d1b7c86780fb2c54cccd" exitCode=0 Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.640887 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tf6g" event={"ID":"ba477a7f-ee05-44cf-be52-4c67c7a50192","Type":"ContainerDied","Data":"40379c5a2d8a39c404ad21e27ef3c63e96a5a8d92f75d1b7c86780fb2c54cccd"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.643026 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-crcqh"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.646105 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.652830 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd","Type":"ContainerStarted","Data":"93e76d5208dbb6223b405bf13a9b1001e11ccc939a5e75d563744bc0539c5150"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.668648 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5qbg\" (UniqueName: \"kubernetes.io/projected/840ef508-c05b-4b3b-bb16-e15729003be1-kube-api-access-k5qbg\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.668715 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-catalog-content\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.668732 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crcqh"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.668770 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-utilities\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.680017 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" 
event={"ID":"2cb39839-5023-4811-8fc9-0432601dc0d8","Type":"ContainerDied","Data":"c2f5982942223e89315e69957c181268fa7746ae935f1c41ae757ef75352e97a"} Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.687997 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pdkfk" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.713802 4912 scope.go:117] "RemoveContainer" containerID="1950cf5aba35aaa13bd0c44e93c0531aa026cefc481fb23198a43ab4cf7f15b8" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.771582 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-utilities\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.773628 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qbg\" (UniqueName: \"kubernetes.io/projected/840ef508-c05b-4b3b-bb16-e15729003be1-kube-api-access-k5qbg\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.773676 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-catalog-content\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.774050 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-catalog-content\") pod \"redhat-operators-crcqh\" 
(UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.773286 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-utilities\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.792674 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.798279 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9c657"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.809380 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5qbg\" (UniqueName: \"kubernetes.io/projected/840ef508-c05b-4b3b-bb16-e15729003be1-kube-api-access-k5qbg\") pod \"redhat-operators-crcqh\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.832702 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6x"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.856178 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.856154091 podStartE2EDuration="1.856154091s" podCreationTimestamp="2026-03-18 13:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:49.853407967 +0000 UTC m=+138.312835402" watchObservedRunningTime="2026-03-18 13:04:49.856154091 +0000 UTC 
m=+138.315581516" Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.867331 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pdkfk"] Mar 18 13:04:49 crc kubenswrapper[4912]: I0318 13:04:49.873403 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pdkfk"] Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.052058 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.052147 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.054612 4912 patch_prober.go:28] interesting pod/console-f9d7485db-vpn9z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.054700 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vpn9z" podUID="39c7b2b0-6f20-426b-961d-65878696145f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.067979 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.135917 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.198227 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"] Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.199650 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.202575 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"] Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.203684 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.206077 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.206481 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.206508 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.206737 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.206996 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.207498 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.207711 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.207882 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.208069 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.208232 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.208283 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.208293 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.216558 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"] Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.216642 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"] Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.222476 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 
13:04:50.249767 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb39839-5023-4811-8fc9-0432601dc0d8" path="/var/lib/kubelet/pods/2cb39839-5023-4811-8fc9-0432601dc0d8/volumes" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.258142 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65fd5ca1-a95e-47d1-9e3c-62178a36eab9" path="/var/lib/kubelet/pods/65fd5ca1-a95e-47d1-9e3c-62178a36eab9/volumes" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.259221 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.281575 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b11f26-7ebf-493c-abe5-e1f792a977ae-serving-cert\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.281657 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-config\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.281685 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hjf9\" (UniqueName: \"kubernetes.io/projected/f7b11f26-7ebf-493c-abe5-e1f792a977ae-kube-api-access-7hjf9\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " 
pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.281748 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-client-ca\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.281766 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-proxy-ca-bundles\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.302329 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.302410 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.302462 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 18 13:04:50 crc 
kubenswrapper[4912]: I0318 13:04:50.302528 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.318572 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.319325 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.342428 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.384948 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crcqh"] Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.387558 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcn4k\" (UniqueName: \"kubernetes.io/projected/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-kube-api-access-bcn4k\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.387744 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b11f26-7ebf-493c-abe5-e1f792a977ae-serving-cert\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 
13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.387865 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-config\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.387913 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hjf9\" (UniqueName: \"kubernetes.io/projected/f7b11f26-7ebf-493c-abe5-e1f792a977ae-kube-api-access-7hjf9\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.387944 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-serving-cert\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.388052 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-client-ca\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.388095 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-config\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.388164 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-proxy-ca-bundles\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.388193 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-client-ca\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.400084 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-client-ca\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.400363 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-proxy-ca-bundles\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.426903 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-config\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.433359 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b11f26-7ebf-493c-abe5-e1f792a977ae-serving-cert\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.438397 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-bbvtw"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.443415 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hjf9\" (UniqueName: \"kubernetes.io/projected/f7b11f26-7ebf-493c-abe5-e1f792a977ae-kube-api-access-7hjf9\") pod \"controller-manager-89dffcdf6-zsqrh\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.449944 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:50 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:50 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.450026 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.498738 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcn4k\" (UniqueName: \"kubernetes.io/projected/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-kube-api-access-bcn4k\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.499439 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-serving-cert\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.500258 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-config\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.502119 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-client-ca\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.504676 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-config\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.504940 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-client-ca\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.505204 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-serving-cert\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.515688 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.518380 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.519068 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcn4k\" (UniqueName: \"kubernetes.io/projected/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-kube-api-access-bcn4k\") pod \"route-controller-manager-67b479887-qj89j\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.533274 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.534328 4912 patch_prober.go:28] interesting pod/apiserver-76f77b778f-t75hw container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]log ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]etcd ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/max-in-flight-filter ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 18 13:04:50 crc kubenswrapper[4912]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/openshift.io-startinformers ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 18 13:04:50 crc kubenswrapper[4912]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 18 13:04:50 crc kubenswrapper[4912]: livez check failed
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.534465 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" podUID="ff22e507-73a7-44b1-9eab-c704fb998092" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.556763 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.731634 4912 generic.go:334] "Generic (PLEG): container finished" podID="15fbed0f-e4cb-400c-8391-a5985bcbd76d" containerID="9edcb7f96278d4573a3b44754677be9bdd466dc047685a4fc49ae383aa9b3f90" exitCode=0
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.731815 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15fbed0f-e4cb-400c-8391-a5985bcbd76d","Type":"ContainerDied","Data":"9edcb7f96278d4573a3b44754677be9bdd466dc047685a4fc49ae383aa9b3f90"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.763638 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" event={"ID":"1e116845-2d89-48e3-b832-584be4553fd3","Type":"ContainerStarted","Data":"5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.767583 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.772915 4912 generic.go:334] "Generic (PLEG): container finished" podID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerID="402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0" exitCode=0
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.773009 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65p9d" event={"ID":"5201881b-c2ba-46b7-aeae-62df63a255e8","Type":"ContainerDied","Data":"402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.788287 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd","Type":"ContainerDied","Data":"6a43a4b30bbc5d66ef9e94c4965c9fd14e445c03dd8d7792a86c92f264413de5"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.788375 4912 generic.go:334] "Generic (PLEG): container finished" podID="1d6fd29e-cd82-47d7-8ae0-91672a9e25bd" containerID="6a43a4b30bbc5d66ef9e94c4965c9fd14e445c03dd8d7792a86c92f264413de5" exitCode=0
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.812743 4912 generic.go:334] "Generic (PLEG): container finished" podID="840ef508-c05b-4b3b-bb16-e15729003be1" containerID="b2f1258846ac7320a95e3ab94ca1e29e0c5ff644720cb146d4aec202fbd8c070" exitCode=0
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.813757 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crcqh" event={"ID":"840ef508-c05b-4b3b-bb16-e15729003be1","Type":"ContainerDied","Data":"b2f1258846ac7320a95e3ab94ca1e29e0c5ff644720cb146d4aec202fbd8c070"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.816412 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crcqh" event={"ID":"840ef508-c05b-4b3b-bb16-e15729003be1","Type":"ContainerStarted","Data":"53a468572fffa5642cc7febdcc839514ca0b215e4624e8562c56463540caf78c"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.815881 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" podStartSLOduration=83.815865546 podStartE2EDuration="1m23.815865546s" podCreationTimestamp="2026-03-18 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:50.811489687 +0000 UTC m=+139.270917112" watchObservedRunningTime="2026-03-18 13:04:50.815865546 +0000 UTC m=+139.275292971"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.828134 4912 generic.go:334] "Generic (PLEG): container finished" podID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerID="8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37" exitCode=0
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.828260 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6x" event={"ID":"46bfbc2d-eb99-4316-a9cc-be875edee92e","Type":"ContainerDied","Data":"8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.828302 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6x" event={"ID":"46bfbc2d-eb99-4316-a9cc-be875edee92e","Type":"ContainerStarted","Data":"05f93d4453d99ee553eb4c01ac2655bc8fa0b2428727af6a75bbd1715a73badf"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.850084 4912 generic.go:334] "Generic (PLEG): container finished" podID="8c189f26-c791-48a6-a060-9982c8666243" containerID="7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc" exitCode=0
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.850223 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj4gk" event={"ID":"8c189f26-c791-48a6-a060-9982c8666243","Type":"ContainerDied","Data":"7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc"}
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.874963 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86"
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.894736 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"]
Mar 18 13:04:50 crc kubenswrapper[4912]: I0318 13:04:50.974722 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"]
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.237242 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.262082 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.323245 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htjvs\" (UniqueName: \"kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs\") pod \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") "
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.323433 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f844c0-ca4c-4097-bedd-bbb4323cc717-secret-volume\") pod \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") "
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.323528 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f844c0-ca4c-4097-bedd-bbb4323cc717-config-volume\") pod \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\" (UID: \"d5f844c0-ca4c-4097-bedd-bbb4323cc717\") "
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.324978 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f844c0-ca4c-4097-bedd-bbb4323cc717-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5f844c0-ca4c-4097-bedd-bbb4323cc717" (UID: "d5f844c0-ca4c-4097-bedd-bbb4323cc717"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.333996 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f844c0-ca4c-4097-bedd-bbb4323cc717-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5f844c0-ca4c-4097-bedd-bbb4323cc717" (UID: "d5f844c0-ca4c-4097-bedd-bbb4323cc717"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.341337 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs" (OuterVolumeSpecName: "kube-api-access-htjvs") pod "d5f844c0-ca4c-4097-bedd-bbb4323cc717" (UID: "d5f844c0-ca4c-4097-bedd-bbb4323cc717"). InnerVolumeSpecName "kube-api-access-htjvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.428248 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f844c0-ca4c-4097-bedd-bbb4323cc717-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.428295 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htjvs\" (UniqueName: \"kubernetes.io/projected/d5f844c0-ca4c-4097-bedd-bbb4323cc717-kube-api-access-htjvs\") on node \"crc\" DevicePath \"\""
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.428317 4912 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f844c0-ca4c-4097-bedd-bbb4323cc717-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.442535 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:51 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:51 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:51 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.442625 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.593697 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ksmgs"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.628813 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.628786384 podStartE2EDuration="628.786384ms" podCreationTimestamp="2026-03-18 13:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:51.622852333 +0000 UTC m=+140.082279768" watchObservedRunningTime="2026-03-18 13:04:51.628786384 +0000 UTC m=+140.088213809"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.865446 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" event={"ID":"f7b11f26-7ebf-493c-abe5-e1f792a977ae","Type":"ContainerStarted","Data":"22f4353e1efdf3728206ef7b2fdbe93a5360cb37f9ac16da94d33523c6804749"}
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.865509 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" event={"ID":"f7b11f26-7ebf-493c-abe5-e1f792a977ae","Type":"ContainerStarted","Data":"35ddb0d329cdf3bcb5eac7534a4ecdca365fe072623ffcf8f85ae8e973b73ccf"}
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.867060 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.874713 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.876297 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" event={"ID":"87e3dca3-6e6d-4de7-a436-72606c4d9ab1","Type":"ContainerStarted","Data":"b64f48cbaef4e6975eb0a3718ae575780b1ffbfdc0a392ce02715fea86e3a3f5"}
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.876334 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" event={"ID":"87e3dca3-6e6d-4de7-a436-72606c4d9ab1","Type":"ContainerStarted","Data":"e83b59b1aaaca03209f394ea8dcaa2f14f3719e744c20b811be62184a12f70e4"}
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.876578 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.909827 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" podStartSLOduration=3.909793514 podStartE2EDuration="3.909793514s" podCreationTimestamp="2026-03-18 13:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:51.899184195 +0000 UTC m=+140.358611640" watchObservedRunningTime="2026-03-18 13:04:51.909793514 +0000 UTC m=+140.369220949"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.911326 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.911760 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f" event={"ID":"d5f844c0-ca4c-4097-bedd-bbb4323cc717","Type":"ContainerDied","Data":"a3727f6e0040dfbd1fb6d3c42a7a69289d42b9fcd627cf738086520a5d5af730"}
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.911815 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3727f6e0040dfbd1fb6d3c42a7a69289d42b9fcd627cf738086520a5d5af730"
Mar 18 13:04:51 crc kubenswrapper[4912]: I0318 13:04:51.924893 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" podStartSLOduration=3.924871734 podStartE2EDuration="3.924871734s" podCreationTimestamp="2026-03-18 13:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:04:51.915630383 +0000 UTC m=+140.375057818" watchObservedRunningTime="2026-03-18 13:04:51.924871734 +0000 UTC m=+140.384299209"
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.462348 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:52 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:52 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:52 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.462431 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.503132 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.747617 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.761585 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.871338 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kubelet-dir\") pod \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") "
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.871413 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kube-api-access\") pod \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\" (UID: \"15fbed0f-e4cb-400c-8391-a5985bcbd76d\") "
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.871478 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kube-api-access\") pod \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") "
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.871502 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kubelet-dir\") pod \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\" (UID: \"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd\") "
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.872025 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1d6fd29e-cd82-47d7-8ae0-91672a9e25bd" (UID: "1d6fd29e-cd82-47d7-8ae0-91672a9e25bd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.872115 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15fbed0f-e4cb-400c-8391-a5985bcbd76d" (UID: "15fbed0f-e4cb-400c-8391-a5985bcbd76d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.910373 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1d6fd29e-cd82-47d7-8ae0-91672a9e25bd" (UID: "1d6fd29e-cd82-47d7-8ae0-91672a9e25bd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.910403 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15fbed0f-e4cb-400c-8391-a5985bcbd76d" (UID: "15fbed0f-e4cb-400c-8391-a5985bcbd76d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.973548 4912 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.973590 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15fbed0f-e4cb-400c-8391-a5985bcbd76d-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.973607 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.973618 4912 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d6fd29e-cd82-47d7-8ae0-91672a9e25bd-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.999236 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1d6fd29e-cd82-47d7-8ae0-91672a9e25bd","Type":"ContainerDied","Data":"93e76d5208dbb6223b405bf13a9b1001e11ccc939a5e75d563744bc0539c5150"}
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.999299 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93e76d5208dbb6223b405bf13a9b1001e11ccc939a5e75d563744bc0539c5150"
Mar 18 13:04:52 crc kubenswrapper[4912]: I0318 13:04:52.999575 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 13:04:53 crc kubenswrapper[4912]: I0318 13:04:53.015852 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 13:04:53 crc kubenswrapper[4912]: I0318 13:04:53.015918 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15fbed0f-e4cb-400c-8391-a5985bcbd76d","Type":"ContainerDied","Data":"9847277f5a15eb09c405f97b10d0a7c6cd58d32b021ceb248268af88f602b28c"}
Mar 18 13:04:53 crc kubenswrapper[4912]: I0318 13:04:53.015954 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9847277f5a15eb09c405f97b10d0a7c6cd58d32b021ceb248268af88f602b28c"
Mar 18 13:04:53 crc kubenswrapper[4912]: I0318 13:04:53.442092 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:53 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:53 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:53 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:53 crc kubenswrapper[4912]: I0318 13:04:53.442173 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:54 crc kubenswrapper[4912]: I0318 13:04:54.260580 4912 ???:1] "http: TLS handshake error from 192.168.126.11:53722: no serving certificate available for the kubelet"
Mar 18 13:04:54 crc kubenswrapper[4912]: I0318 13:04:54.440591 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:54 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:54 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:54 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:54 crc kubenswrapper[4912]: I0318 13:04:54.440697 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:54 crc kubenswrapper[4912]: I0318 13:04:54.526774 4912 ???:1] "http: TLS handshake error from 192.168.126.11:53730: no serving certificate available for the kubelet"
Mar 18 13:04:55 crc kubenswrapper[4912]: I0318 13:04:55.446401 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:55 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:55 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:55 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:55 crc kubenswrapper[4912]: I0318 13:04:55.446652 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:55 crc kubenswrapper[4912]: I0318 13:04:55.522212 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:55 crc kubenswrapper[4912]: I0318 13:04:55.529676 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-t75hw"
Mar 18 13:04:56 crc kubenswrapper[4912]: I0318 13:04:56.441318 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:56 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:56 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:56 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:56 crc kubenswrapper[4912]: I0318 13:04:56.441388 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:57 crc kubenswrapper[4912]: I0318 13:04:57.441551 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 13:04:57 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld
Mar 18 13:04:57 crc kubenswrapper[4912]: [+]process-running ok
Mar 18 13:04:57 crc kubenswrapper[4912]: healthz check failed
Mar 18 13:04:57 crc kubenswrapper[4912]: I0318 13:04:57.441641 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 13:04:57 crc kubenswrapper[4912]: E0318 13:04:57.844960 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 13:04:57 crc kubenswrapper[4912]: E0318 
13:04:57.854079 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:04:57 crc kubenswrapper[4912]: E0318 13:04:57.860765 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:04:57 crc kubenswrapper[4912]: E0318 13:04:57.860846 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerName="kube-multus-additional-cni-plugins" Mar 18 13:04:58 crc kubenswrapper[4912]: I0318 13:04:58.442473 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:58 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:58 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:58 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:58 crc kubenswrapper[4912]: I0318 13:04:58.443374 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:04:59 crc 
kubenswrapper[4912]: I0318 13:04:59.440353 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:04:59 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:04:59 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:04:59 crc kubenswrapper[4912]: healthz check failed Mar 18 13:04:59 crc kubenswrapper[4912]: I0318 13:04:59.440478 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:05:00 crc kubenswrapper[4912]: I0318 13:05:00.052557 4912 patch_prober.go:28] interesting pod/console-f9d7485db-vpn9z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 18 13:05:00 crc kubenswrapper[4912]: I0318 13:05:00.052642 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vpn9z" podUID="39c7b2b0-6f20-426b-961d-65878696145f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 18 13:05:00 crc kubenswrapper[4912]: I0318 13:05:00.253821 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 13:05:00 crc kubenswrapper[4912]: I0318 13:05:00.312129 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2ghlz" Mar 18 13:05:00 crc kubenswrapper[4912]: I0318 13:05:00.329812 4912 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.329788459 podStartE2EDuration="329.788459ms" podCreationTimestamp="2026-03-18 13:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:00.327132837 +0000 UTC m=+148.786560282" watchObservedRunningTime="2026-03-18 13:05:00.329788459 +0000 UTC m=+148.789215884" Mar 18 13:05:00 crc kubenswrapper[4912]: I0318 13:05:00.441662 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:05:00 crc kubenswrapper[4912]: [-]has-synced failed: reason withheld Mar 18 13:05:00 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:05:00 crc kubenswrapper[4912]: healthz check failed Mar 18 13:05:00 crc kubenswrapper[4912]: I0318 13:05:00.441779 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:05:01 crc kubenswrapper[4912]: I0318 13:05:01.441420 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 13:05:01 crc kubenswrapper[4912]: [+]has-synced ok Mar 18 13:05:01 crc kubenswrapper[4912]: [+]process-running ok Mar 18 13:05:01 crc kubenswrapper[4912]: healthz check failed Mar 18 13:05:01 crc kubenswrapper[4912]: I0318 13:05:01.441907 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" 
podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 13:05:02 crc kubenswrapper[4912]: I0318 13:05:02.441681 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:05:02 crc kubenswrapper[4912]: I0318 13:05:02.444362 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 13:05:06 crc kubenswrapper[4912]: I0318 13:05:06.714853 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"] Mar 18 13:05:06 crc kubenswrapper[4912]: I0318 13:05:06.715795 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" podUID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" containerName="controller-manager" containerID="cri-o://22f4353e1efdf3728206ef7b2fdbe93a5360cb37f9ac16da94d33523c6804749" gracePeriod=30 Mar 18 13:05:06 crc kubenswrapper[4912]: I0318 13:05:06.740640 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"] Mar 18 13:05:06 crc kubenswrapper[4912]: I0318 13:05:06.740869 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" podUID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" containerName="route-controller-manager" containerID="cri-o://b64f48cbaef4e6975eb0a3718ae575780b1ffbfdc0a392ce02715fea86e3a3f5" gracePeriod=30 Mar 18 13:05:07 crc kubenswrapper[4912]: I0318 13:05:07.577421 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 13:05:07 crc kubenswrapper[4912]: E0318 13:05:07.831168 4912 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:05:07 crc kubenswrapper[4912]: E0318 13:05:07.833150 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:05:07 crc kubenswrapper[4912]: E0318 13:05:07.835320 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:05:07 crc kubenswrapper[4912]: E0318 13:05:07.835449 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerName="kube-multus-additional-cni-plugins" Mar 18 13:05:08 crc kubenswrapper[4912]: I0318 13:05:08.190059 4912 generic.go:334] "Generic (PLEG): container finished" podID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" containerID="22f4353e1efdf3728206ef7b2fdbe93a5360cb37f9ac16da94d33523c6804749" exitCode=0 Mar 18 13:05:08 crc kubenswrapper[4912]: I0318 13:05:08.190137 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" 
event={"ID":"f7b11f26-7ebf-493c-abe5-e1f792a977ae","Type":"ContainerDied","Data":"22f4353e1efdf3728206ef7b2fdbe93a5360cb37f9ac16da94d33523c6804749"} Mar 18 13:05:08 crc kubenswrapper[4912]: I0318 13:05:08.191660 4912 generic.go:334] "Generic (PLEG): container finished" podID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" containerID="b64f48cbaef4e6975eb0a3718ae575780b1ffbfdc0a392ce02715fea86e3a3f5" exitCode=0 Mar 18 13:05:08 crc kubenswrapper[4912]: I0318 13:05:08.191693 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" event={"ID":"87e3dca3-6e6d-4de7-a436-72606c4d9ab1","Type":"ContainerDied","Data":"b64f48cbaef4e6975eb0a3718ae575780b1ffbfdc0a392ce02715fea86e3a3f5"} Mar 18 13:05:08 crc kubenswrapper[4912]: I0318 13:05:08.999499 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:05:10 crc kubenswrapper[4912]: I0318 13:05:10.060463 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:05:10 crc kubenswrapper[4912]: I0318 13:05:10.065960 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:05:10 crc kubenswrapper[4912]: I0318 13:05:10.535256 4912 patch_prober.go:28] interesting pod/controller-manager-89dffcdf6-zsqrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 18 13:05:10 crc kubenswrapper[4912]: I0318 13:05:10.535849 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" podUID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 18 13:05:10 crc kubenswrapper[4912]: I0318 13:05:10.557775 4912 patch_prober.go:28] interesting pod/route-controller-manager-67b479887-qj89j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 18 13:05:10 crc kubenswrapper[4912]: I0318 13:05:10.557843 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" podUID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 18 13:05:14 crc kubenswrapper[4912]: I0318 13:05:14.231307 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nvzrd_d456817a-6755-41c0-bf82-bbb3bf4c35fa/kube-multus-additional-cni-plugins/0.log" Mar 18 13:05:14 crc kubenswrapper[4912]: I0318 13:05:14.231698 4912 generic.go:334] "Generic (PLEG): container finished" podID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" exitCode=137 Mar 18 13:05:14 crc kubenswrapper[4912]: I0318 13:05:14.239171 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" event={"ID":"d456817a-6755-41c0-bf82-bbb3bf4c35fa","Type":"ContainerDied","Data":"be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6"} Mar 18 13:05:15 crc kubenswrapper[4912]: I0318 13:05:15.034957 4912 ???:1] "http: TLS handshake error from 192.168.126.11:45898: no serving certificate available for the kubelet" Mar 18 13:05:17 crc kubenswrapper[4912]: E0318 13:05:17.830326 4912 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6 is running failed: container process not found" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:05:17 crc kubenswrapper[4912]: E0318 13:05:17.833385 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6 is running failed: container process not found" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:05:17 crc kubenswrapper[4912]: E0318 13:05:17.834103 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6 is running failed: container process not found" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 13:05:17 crc kubenswrapper[4912]: E0318 13:05:17.834151 4912 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerName="kube-multus-additional-cni-plugins" Mar 18 13:05:20 crc kubenswrapper[4912]: E0318 13:05:20.050673 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 13:05:20 crc kubenswrapper[4912]: E0318 13:05:20.050869 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k5qbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-crcqh_openshift-marketplace(840ef508-c05b-4b3b-bb16-e15729003be1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 13:05:20 crc 
kubenswrapper[4912]: E0318 13:05:20.052552 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-crcqh" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" Mar 18 13:05:20 crc kubenswrapper[4912]: I0318 13:05:20.211933 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 13:05:21 crc kubenswrapper[4912]: I0318 13:05:21.534840 4912 patch_prober.go:28] interesting pod/controller-manager-89dffcdf6-zsqrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:05:21 crc kubenswrapper[4912]: I0318 13:05:21.535295 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" podUID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:05:21 crc kubenswrapper[4912]: I0318 13:05:21.558142 4912 patch_prober.go:28] interesting pod/route-controller-manager-67b479887-qj89j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:05:21 crc kubenswrapper[4912]: I0318 13:05:21.558630 4912 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" podUID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:05:21 crc kubenswrapper[4912]: E0318 13:05:21.698648 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-crcqh" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" Mar 18 13:05:21 crc kubenswrapper[4912]: E0318 13:05:21.829373 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 13:05:21 crc kubenswrapper[4912]: E0318 13:05:21.829623 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7rtr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gl7cm_openshift-marketplace(3af91020-4095-48f0-9457-b171de576fe0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 13:05:21 crc kubenswrapper[4912]: E0318 13:05:21.830872 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gl7cm" podUID="3af91020-4095-48f0-9457-b171de576fe0" Mar 18 13:05:21 crc 
kubenswrapper[4912]: E0318 13:05:21.844738 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 13:05:21 crc kubenswrapper[4912]: E0318 13:05:21.844924 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mn7gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-r9dkt_openshift-marketplace(797e0d01-0e3c-498f-abe9-5c90c0e53215): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 13:05:21 crc kubenswrapper[4912]: E0318 13:05:21.846210 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r9dkt" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.474645 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 13:05:22 crc kubenswrapper[4912]: E0318 13:05:22.475224 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fbed0f-e4cb-400c-8391-a5985bcbd76d" containerName="pruner" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.475239 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fbed0f-e4cb-400c-8391-a5985bcbd76d" containerName="pruner" Mar 18 13:05:22 crc kubenswrapper[4912]: E0318 13:05:22.475254 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f844c0-ca4c-4097-bedd-bbb4323cc717" containerName="collect-profiles" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.475262 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f844c0-ca4c-4097-bedd-bbb4323cc717" containerName="collect-profiles" Mar 18 13:05:22 crc kubenswrapper[4912]: E0318 13:05:22.475277 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6fd29e-cd82-47d7-8ae0-91672a9e25bd" containerName="pruner" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.475283 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6fd29e-cd82-47d7-8ae0-91672a9e25bd" containerName="pruner" Mar 18 13:05:22 crc 
kubenswrapper[4912]: I0318 13:05:22.475384 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fbed0f-e4cb-400c-8391-a5985bcbd76d" containerName="pruner" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.475400 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6fd29e-cd82-47d7-8ae0-91672a9e25bd" containerName="pruner" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.475409 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f844c0-ca4c-4097-bedd-bbb4323cc717" containerName="collect-profiles" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.475950 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.478269 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.478506 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.478730 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.649433 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.649517 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.750956 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.751383 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.751145 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.774828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:22 crc kubenswrapper[4912]: I0318 13:05:22.801858 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.359234 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gl7cm" podUID="3af91020-4095-48f0-9457-b171de576fe0" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.359243 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-r9dkt" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.477777 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.478089 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kp5sc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-65p9d_openshift-marketplace(5201881b-c2ba-46b7-aeae-62df63a255e8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.479284 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-65p9d" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" Mar 18 13:05:23 crc 
kubenswrapper[4912]: I0318 13:05:23.665937 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.713623 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6"] Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.714779 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" containerName="route-controller-manager" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.714818 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" containerName="route-controller-manager" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.715092 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" containerName="route-controller-manager" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.715924 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.733455 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6"] Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.767422 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-serving-cert\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.767484 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-config\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.767569 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-client-ca\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.767731 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77wfd\" (UniqueName: \"kubernetes.io/projected/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-kube-api-access-77wfd\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: 
\"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.773779 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.793650 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.793895 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d58k4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,R
unAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hj4gk_openshift-marketplace(8c189f26-c791-48a6-a060-9982c8666243): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 13:05:23 crc kubenswrapper[4912]: E0318 13:05:23.795219 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hj4gk" podUID="8c189f26-c791-48a6-a060-9982c8666243" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.810320 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nvzrd_d456817a-6755-41c0-bf82-bbb3bf4c35fa/kube-multus-additional-cni-plugins/0.log" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.810406 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872560 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcn4k\" (UniqueName: \"kubernetes.io/projected/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-kube-api-access-bcn4k\") pod \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872641 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d456817a-6755-41c0-bf82-bbb3bf4c35fa-tuning-conf-dir\") pod \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872669 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-client-ca\") pod \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872722 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hjf9\" (UniqueName: \"kubernetes.io/projected/f7b11f26-7ebf-493c-abe5-e1f792a977ae-kube-api-access-7hjf9\") pod \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872750 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-client-ca\") pod \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872818 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b11f26-7ebf-493c-abe5-e1f792a977ae-serving-cert\") pod \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872882 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-proxy-ca-bundles\") pod \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872929 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl75w\" (UniqueName: \"kubernetes.io/projected/d456817a-6755-41c0-bf82-bbb3bf4c35fa-kube-api-access-nl75w\") pod \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872958 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-config\") pod \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\" (UID: \"f7b11f26-7ebf-493c-abe5-e1f792a977ae\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872985 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-serving-cert\") pod \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.872978 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d456817a-6755-41c0-bf82-bbb3bf4c35fa-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "d456817a-6755-41c0-bf82-bbb3bf4c35fa" (UID: "d456817a-6755-41c0-bf82-bbb3bf4c35fa"). 
InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873022 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d456817a-6755-41c0-bf82-bbb3bf4c35fa-ready\") pod \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873158 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-config\") pod \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\" (UID: \"87e3dca3-6e6d-4de7-a436-72606c4d9ab1\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873201 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d456817a-6755-41c0-bf82-bbb3bf4c35fa-cni-sysctl-allowlist\") pod \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\" (UID: \"d456817a-6755-41c0-bf82-bbb3bf4c35fa\") " Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873485 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-serving-cert\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873535 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-config\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 
13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873564 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-client-ca\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873617 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77wfd\" (UniqueName: \"kubernetes.io/projected/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-kube-api-access-77wfd\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.873712 4912 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d456817a-6755-41c0-bf82-bbb3bf4c35fa-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.875217 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-config" (OuterVolumeSpecName: "config") pod "f7b11f26-7ebf-493c-abe5-e1f792a977ae" (UID: "f7b11f26-7ebf-493c-abe5-e1f792a977ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.875867 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-client-ca" (OuterVolumeSpecName: "client-ca") pod "87e3dca3-6e6d-4de7-a436-72606c4d9ab1" (UID: "87e3dca3-6e6d-4de7-a436-72606c4d9ab1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.876395 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f7b11f26-7ebf-493c-abe5-e1f792a977ae" (UID: "f7b11f26-7ebf-493c-abe5-e1f792a977ae"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.877342 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d456817a-6755-41c0-bf82-bbb3bf4c35fa-ready" (OuterVolumeSpecName: "ready") pod "d456817a-6755-41c0-bf82-bbb3bf4c35fa" (UID: "d456817a-6755-41c0-bf82-bbb3bf4c35fa"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.877893 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-config\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.877973 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-config" (OuterVolumeSpecName: "config") pod "87e3dca3-6e6d-4de7-a436-72606c4d9ab1" (UID: "87e3dca3-6e6d-4de7-a436-72606c4d9ab1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.880213 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-client-ca\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.880255 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "f7b11f26-7ebf-493c-abe5-e1f792a977ae" (UID: "f7b11f26-7ebf-493c-abe5-e1f792a977ae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.881570 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d456817a-6755-41c0-bf82-bbb3bf4c35fa-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "d456817a-6755-41c0-bf82-bbb3bf4c35fa" (UID: "d456817a-6755-41c0-bf82-bbb3bf4c35fa"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.883304 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-serving-cert\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.883789 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d456817a-6755-41c0-bf82-bbb3bf4c35fa-kube-api-access-nl75w" (OuterVolumeSpecName: "kube-api-access-nl75w") pod "d456817a-6755-41c0-bf82-bbb3bf4c35fa" (UID: "d456817a-6755-41c0-bf82-bbb3bf4c35fa"). InnerVolumeSpecName "kube-api-access-nl75w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.883961 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-kube-api-access-bcn4k" (OuterVolumeSpecName: "kube-api-access-bcn4k") pod "87e3dca3-6e6d-4de7-a436-72606c4d9ab1" (UID: "87e3dca3-6e6d-4de7-a436-72606c4d9ab1"). InnerVolumeSpecName "kube-api-access-bcn4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.885397 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87e3dca3-6e6d-4de7-a436-72606c4d9ab1" (UID: "87e3dca3-6e6d-4de7-a436-72606c4d9ab1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.891426 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b11f26-7ebf-493c-abe5-e1f792a977ae-kube-api-access-7hjf9" (OuterVolumeSpecName: "kube-api-access-7hjf9") pod "f7b11f26-7ebf-493c-abe5-e1f792a977ae" (UID: "f7b11f26-7ebf-493c-abe5-e1f792a977ae"). InnerVolumeSpecName "kube-api-access-7hjf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.902721 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b11f26-7ebf-493c-abe5-e1f792a977ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f7b11f26-7ebf-493c-abe5-e1f792a977ae" (UID: "f7b11f26-7ebf-493c-abe5-e1f792a977ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.905672 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77wfd\" (UniqueName: \"kubernetes.io/projected/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-kube-api-access-77wfd\") pod \"route-controller-manager-659d7b9f67-x7bj6\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974414 4912 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d456817a-6755-41c0-bf82-bbb3bf4c35fa-ready\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974447 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974459 4912 reconciler_common.go:293] "Volume 
detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d456817a-6755-41c0-bf82-bbb3bf4c35fa-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974470 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcn4k\" (UniqueName: \"kubernetes.io/projected/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-kube-api-access-bcn4k\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974479 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974488 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974497 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hjf9\" (UniqueName: \"kubernetes.io/projected/f7b11f26-7ebf-493c-abe5-e1f792a977ae-kube-api-access-7hjf9\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974505 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b11f26-7ebf-493c-abe5-e1f792a977ae-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974515 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974524 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl75w\" (UniqueName: 
\"kubernetes.io/projected/d456817a-6755-41c0-bf82-bbb3bf4c35fa-kube-api-access-nl75w\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974532 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b11f26-7ebf-493c-abe5-e1f792a977ae-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.974540 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e3dca3-6e6d-4de7-a436-72606c4d9ab1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:23 crc kubenswrapper[4912]: I0318 13:05:23.987013 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.070714 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.298737 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec","Type":"ContainerStarted","Data":"f60b83c89c0c79483ba71a55d191baccea1ed7c1b4677836188147dfadeaf2c6"} Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.302905 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" event={"ID":"87e3dca3-6e6d-4de7-a436-72606c4d9ab1","Type":"ContainerDied","Data":"e83b59b1aaaca03209f394ea8dcaa2f14f3719e744c20b811be62184a12f70e4"} Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.302996 4912 scope.go:117] "RemoveContainer" containerID="b64f48cbaef4e6975eb0a3718ae575780b1ffbfdc0a392ce02715fea86e3a3f5" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.303196 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b479887-qj89j" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.305357 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nvzrd_d456817a-6755-41c0-bf82-bbb3bf4c35fa/kube-multus-additional-cni-plugins/0.log" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.305561 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" event={"ID":"d456817a-6755-41c0-bf82-bbb3bf4c35fa","Type":"ContainerDied","Data":"04ab0faf81bfa3db548e29800172ff12afdf3814ca1b074b07f824e2e0f9c2f1"} Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.306200 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nvzrd" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.317105 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6ggq" event={"ID":"aa54da71-3eed-40ca-a608-43d7f9273e80","Type":"ContainerStarted","Data":"b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7"} Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.319281 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6x" event={"ID":"46bfbc2d-eb99-4316-a9cc-be875edee92e","Type":"ContainerStarted","Data":"d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77"} Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.328369 4912 scope.go:117] "RemoveContainer" containerID="be6a4f565de8589a021394b348c635d61c89fa7af4cb368cf0b13598ecc6cdd6" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.335818 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"] Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.335866 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tf6g" event={"ID":"ba477a7f-ee05-44cf-be52-4c67c7a50192","Type":"ContainerStarted","Data":"d3dc23f1768beb0da6ad9cff157c22c6801e5e018cfaa8656a5b764ac9c514e1"} Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.339856 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6"] Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.341109 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" event={"ID":"f7b11f26-7ebf-493c-abe5-e1f792a977ae","Type":"ContainerDied","Data":"35ddb0d329cdf3bcb5eac7534a4ecdca365fe072623ffcf8f85ae8e973b73ccf"} Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.341975 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-89dffcdf6-zsqrh" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.347085 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b479887-qj89j"] Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.350283 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nvzrd"] Mar 18 13:05:24 crc kubenswrapper[4912]: E0318 13:05:24.351365 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-65p9d" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" Mar 18 13:05:24 crc kubenswrapper[4912]: E0318 13:05:24.351392 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hj4gk" podUID="8c189f26-c791-48a6-a060-9982c8666243" Mar 18 13:05:24 crc kubenswrapper[4912]: W0318 13:05:24.352818 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23963a4_2bdb_4f75_a1b2_713fdea2d32b.slice/crio-00a4457ed0826e883e334e0495286728956a6a655fdc72a4d08e987fe4908e54 WatchSource:0}: Error finding container 00a4457ed0826e883e334e0495286728956a6a655fdc72a4d08e987fe4908e54: Status 404 returned error can't find the container with id 00a4457ed0826e883e334e0495286728956a6a655fdc72a4d08e987fe4908e54 Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.368912 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nvzrd"] Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.370347 4912 scope.go:117] "RemoveContainer" containerID="22f4353e1efdf3728206ef7b2fdbe93a5360cb37f9ac16da94d33523c6804749" Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.454787 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"] Mar 18 13:05:24 crc kubenswrapper[4912]: I0318 13:05:24.458872 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-89dffcdf6-zsqrh"] Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.360874 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" event={"ID":"e23963a4-2bdb-4f75-a1b2-713fdea2d32b","Type":"ContainerStarted","Data":"7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981"} Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.361438 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" 
event={"ID":"e23963a4-2bdb-4f75-a1b2-713fdea2d32b","Type":"ContainerStarted","Data":"00a4457ed0826e883e334e0495286728956a6a655fdc72a4d08e987fe4908e54"} Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.362350 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.367960 4912 generic.go:334] "Generic (PLEG): container finished" podID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerID="b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7" exitCode=0 Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.368028 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6ggq" event={"ID":"aa54da71-3eed-40ca-a608-43d7f9273e80","Type":"ContainerDied","Data":"b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7"} Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.371600 4912 generic.go:334] "Generic (PLEG): container finished" podID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerID="d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77" exitCode=0 Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.371688 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6x" event={"ID":"46bfbc2d-eb99-4316-a9cc-be875edee92e","Type":"ContainerDied","Data":"d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77"} Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.374031 4912 generic.go:334] "Generic (PLEG): container finished" podID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerID="d3dc23f1768beb0da6ad9cff157c22c6801e5e018cfaa8656a5b764ac9c514e1" exitCode=0 Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.374131 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tf6g" 
event={"ID":"ba477a7f-ee05-44cf-be52-4c67c7a50192","Type":"ContainerDied","Data":"d3dc23f1768beb0da6ad9cff157c22c6801e5e018cfaa8656a5b764ac9c514e1"} Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.380330 4912 generic.go:334] "Generic (PLEG): container finished" podID="77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec" containerID="9ddb916a3fcfb16c0b483b0f7e949c1cb7c4692510f49c29b13a1a2ade3cdc3d" exitCode=0 Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.380382 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec","Type":"ContainerDied","Data":"9ddb916a3fcfb16c0b483b0f7e949c1cb7c4692510f49c29b13a1a2ade3cdc3d"} Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.389110 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.389841 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" podStartSLOduration=19.389826308 podStartE2EDuration="19.389826308s" podCreationTimestamp="2026-03-18 13:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:25.38478479 +0000 UTC m=+173.844212235" watchObservedRunningTime="2026-03-18 13:05:25.389826308 +0000 UTC m=+173.849253733" Mar 18 13:05:25 crc kubenswrapper[4912]: I0318 13:05:25.631598 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8fqp"] Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.218407 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll"] Mar 18 13:05:26 crc kubenswrapper[4912]: E0318 13:05:26.218887 4912 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" containerName="controller-manager" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.218987 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" containerName="controller-manager" Mar 18 13:05:26 crc kubenswrapper[4912]: E0318 13:05:26.219080 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerName="kube-multus-additional-cni-plugins" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.219205 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerName="kube-multus-additional-cni-plugins" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.219750 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" containerName="controller-manager" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.219826 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" containerName="kube-multus-additional-cni-plugins" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.222719 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.231352 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.231825 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.232539 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.232623 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.232726 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.232790 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.235825 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e3dca3-6e6d-4de7-a436-72606c4d9ab1" path="/var/lib/kubelet/pods/87e3dca3-6e6d-4de7-a436-72606c4d9ab1/volumes" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.236388 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d456817a-6755-41c0-bf82-bbb3bf4c35fa" path="/var/lib/kubelet/pods/d456817a-6755-41c0-bf82-bbb3bf4c35fa/volumes" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.237106 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b11f26-7ebf-493c-abe5-e1f792a977ae" path="/var/lib/kubelet/pods/f7b11f26-7ebf-493c-abe5-e1f792a977ae/volumes" Mar 18 13:05:26 crc kubenswrapper[4912]: 
I0318 13:05:26.237422 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.237589 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll"] Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.310489 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-proxy-ca-bundles\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.310581 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-config\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.310666 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b839ced0-e4b0-4389-b77c-4f335d33f40a-serving-cert\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.310740 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qktz2\" (UniqueName: \"kubernetes.io/projected/b839ced0-e4b0-4389-b77c-4f335d33f40a-kube-api-access-qktz2\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: 
\"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.310768 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-client-ca\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.412967 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-proxy-ca-bundles\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.413577 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-config\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.413617 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b839ced0-e4b0-4389-b77c-4f335d33f40a-serving-cert\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.413647 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qktz2\" (UniqueName: 
\"kubernetes.io/projected/b839ced0-e4b0-4389-b77c-4f335d33f40a-kube-api-access-qktz2\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.413681 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-client-ca\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.415748 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-client-ca\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.417460 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-config\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.419480 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-proxy-ca-bundles\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.426318 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b839ced0-e4b0-4389-b77c-4f335d33f40a-serving-cert\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.442483 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qktz2\" (UniqueName: \"kubernetes.io/projected/b839ced0-e4b0-4389-b77c-4f335d33f40a-kube-api-access-qktz2\") pod \"controller-manager-655d9fbc5f-6k6ll\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.557426 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.632147 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.730481 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kubelet-dir\") pod \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.730930 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kube-api-access\") pod \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\" (UID: \"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec\") " Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.737174 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec" (UID: "77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.753248 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec" (UID: "77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.817831 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll"] Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.832846 4912 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.832892 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:26 crc kubenswrapper[4912]: I0318 13:05:26.905117 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6"] Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.117845 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll"] Mar 18 13:05:27 crc kubenswrapper[4912]: W0318 13:05:27.134235 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb839ced0_e4b0_4389_b77c_4f335d33f40a.slice/crio-7172da9d29da03f6d46b40bb910357099641d435a4246435f282c3de7c16e8e4 WatchSource:0}: Error finding container 7172da9d29da03f6d46b40bb910357099641d435a4246435f282c3de7c16e8e4: Status 404 returned error can't find the container with id 7172da9d29da03f6d46b40bb910357099641d435a4246435f282c3de7c16e8e4 Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.400876 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6x" 
event={"ID":"46bfbc2d-eb99-4316-a9cc-be875edee92e","Type":"ContainerStarted","Data":"b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82"} Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.404033 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" event={"ID":"b839ced0-e4b0-4389-b77c-4f335d33f40a","Type":"ContainerStarted","Data":"7172da9d29da03f6d46b40bb910357099641d435a4246435f282c3de7c16e8e4"} Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.408005 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tf6g" event={"ID":"ba477a7f-ee05-44cf-be52-4c67c7a50192","Type":"ContainerStarted","Data":"908a121d5f7ab1fc6443809189516caf529fdc79bd633bc2ca417358e4083120"} Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.411864 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec","Type":"ContainerDied","Data":"f60b83c89c0c79483ba71a55d191baccea1ed7c1b4677836188147dfadeaf2c6"} Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.411916 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60b83c89c0c79483ba71a55d191baccea1ed7c1b4677836188147dfadeaf2c6" Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.411997 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.425972 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6ggq" event={"ID":"aa54da71-3eed-40ca-a608-43d7f9273e80","Type":"ContainerStarted","Data":"52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056"} Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.434920 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n2x6x" podStartSLOduration=3.007963886 podStartE2EDuration="38.434894437s" podCreationTimestamp="2026-03-18 13:04:49 +0000 UTC" firstStartedPulling="2026-03-18 13:04:50.837455884 +0000 UTC m=+139.296883309" lastFinishedPulling="2026-03-18 13:05:26.264386435 +0000 UTC m=+174.723813860" observedRunningTime="2026-03-18 13:05:27.428855942 +0000 UTC m=+175.888283387" watchObservedRunningTime="2026-03-18 13:05:27.434894437 +0000 UTC m=+175.894321862" Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.454689 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k6ggq" podStartSLOduration=4.620359673 podStartE2EDuration="41.454643184s" podCreationTimestamp="2026-03-18 13:04:46 +0000 UTC" firstStartedPulling="2026-03-18 13:04:49.525555772 +0000 UTC m=+137.984983197" lastFinishedPulling="2026-03-18 13:05:26.359839283 +0000 UTC m=+174.819266708" observedRunningTime="2026-03-18 13:05:27.451290373 +0000 UTC m=+175.910717808" watchObservedRunningTime="2026-03-18 13:05:27.454643184 +0000 UTC m=+175.914070609" Mar 18 13:05:27 crc kubenswrapper[4912]: I0318 13:05:27.477653 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4tf6g" podStartSLOduration=4.910335927 podStartE2EDuration="41.47762854s" podCreationTimestamp="2026-03-18 13:04:46 +0000 UTC" 
firstStartedPulling="2026-03-18 13:04:49.64818841 +0000 UTC m=+138.107615835" lastFinishedPulling="2026-03-18 13:05:26.215481023 +0000 UTC m=+174.674908448" observedRunningTime="2026-03-18 13:05:27.476473329 +0000 UTC m=+175.935900754" watchObservedRunningTime="2026-03-18 13:05:27.47762854 +0000 UTC m=+175.937055975" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.432538 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" event={"ID":"b839ced0-e4b0-4389-b77c-4f335d33f40a","Type":"ContainerStarted","Data":"9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271"} Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.432559 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" podUID="b839ced0-e4b0-4389-b77c-4f335d33f40a" containerName="controller-manager" containerID="cri-o://9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271" gracePeriod=30 Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.433374 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.433473 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" podUID="e23963a4-2bdb-4f75-a1b2-713fdea2d32b" containerName="route-controller-manager" containerID="cri-o://7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981" gracePeriod=30 Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.445083 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.464762 4912 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" podStartSLOduration=22.46472749 podStartE2EDuration="22.46472749s" podCreationTimestamp="2026-03-18 13:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:28.461976276 +0000 UTC m=+176.921403731" watchObservedRunningTime="2026-03-18 13:05:28.46472749 +0000 UTC m=+176.924154935" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.920736 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.927576 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.965687 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68b4975746-rrxsl"] Mar 18 13:05:28 crc kubenswrapper[4912]: E0318 13:05:28.966471 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b839ced0-e4b0-4389-b77c-4f335d33f40a" containerName="controller-manager" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.966497 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b839ced0-e4b0-4389-b77c-4f335d33f40a" containerName="controller-manager" Mar 18 13:05:28 crc kubenswrapper[4912]: E0318 13:05:28.966512 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23963a4-2bdb-4f75-a1b2-713fdea2d32b" containerName="route-controller-manager" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.966523 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23963a4-2bdb-4f75-a1b2-713fdea2d32b" containerName="route-controller-manager" Mar 18 13:05:28 crc kubenswrapper[4912]: E0318 13:05:28.966545 4912 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec" containerName="pruner" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.966555 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec" containerName="pruner" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.966863 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="77d511b1-4a45-4cc2-a4b6-3bf14b7fa5ec" containerName="pruner" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.966899 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23963a4-2bdb-4f75-a1b2-713fdea2d32b" containerName="route-controller-manager" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.966915 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b839ced0-e4b0-4389-b77c-4f335d33f40a" containerName="controller-manager" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968135 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-client-ca\") pod \"b839ced0-e4b0-4389-b77c-4f335d33f40a\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968262 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-config\") pod \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968339 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-proxy-ca-bundles\") pod \"b839ced0-e4b0-4389-b77c-4f335d33f40a\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " Mar 18 13:05:28 crc 
kubenswrapper[4912]: I0318 13:05:28.968407 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-config\") pod \"b839ced0-e4b0-4389-b77c-4f335d33f40a\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968473 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b839ced0-e4b0-4389-b77c-4f335d33f40a-serving-cert\") pod \"b839ced0-e4b0-4389-b77c-4f335d33f40a\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968525 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-serving-cert\") pod \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968559 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-client-ca\") pod \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968627 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77wfd\" (UniqueName: \"kubernetes.io/projected/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-kube-api-access-77wfd\") pod \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\" (UID: \"e23963a4-2bdb-4f75-a1b2-713fdea2d32b\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.968656 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qktz2\" (UniqueName: \"kubernetes.io/projected/b839ced0-e4b0-4389-b77c-4f335d33f40a-kube-api-access-qktz2\") 
pod \"b839ced0-e4b0-4389-b77c-4f335d33f40a\" (UID: \"b839ced0-e4b0-4389-b77c-4f335d33f40a\") " Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.971805 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.973610 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-config" (OuterVolumeSpecName: "config") pod "b839ced0-e4b0-4389-b77c-4f335d33f40a" (UID: "b839ced0-e4b0-4389-b77c-4f335d33f40a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.974278 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-client-ca" (OuterVolumeSpecName: "client-ca") pod "b839ced0-e4b0-4389-b77c-4f335d33f40a" (UID: "b839ced0-e4b0-4389-b77c-4f335d33f40a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.974471 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-config" (OuterVolumeSpecName: "config") pod "e23963a4-2bdb-4f75-a1b2-713fdea2d32b" (UID: "e23963a4-2bdb-4f75-a1b2-713fdea2d32b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.976245 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e23963a4-2bdb-4f75-a1b2-713fdea2d32b" (UID: "e23963a4-2bdb-4f75-a1b2-713fdea2d32b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.977563 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b839ced0-e4b0-4389-b77c-4f335d33f40a-kube-api-access-qktz2" (OuterVolumeSpecName: "kube-api-access-qktz2") pod "b839ced0-e4b0-4389-b77c-4f335d33f40a" (UID: "b839ced0-e4b0-4389-b77c-4f335d33f40a"). InnerVolumeSpecName "kube-api-access-qktz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.978825 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b839ced0-e4b0-4389-b77c-4f335d33f40a" (UID: "b839ced0-e4b0-4389-b77c-4f335d33f40a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.993751 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e23963a4-2bdb-4f75-a1b2-713fdea2d32b" (UID: "e23963a4-2bdb-4f75-a1b2-713fdea2d32b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.993853 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-kube-api-access-77wfd" (OuterVolumeSpecName: "kube-api-access-77wfd") pod "e23963a4-2bdb-4f75-a1b2-713fdea2d32b" (UID: "e23963a4-2bdb-4f75-a1b2-713fdea2d32b"). InnerVolumeSpecName "kube-api-access-77wfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:28 crc kubenswrapper[4912]: I0318 13:05:28.996246 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b839ced0-e4b0-4389-b77c-4f335d33f40a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b839ced0-e4b0-4389-b77c-4f335d33f40a" (UID: "b839ced0-e4b0-4389-b77c-4f335d33f40a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.007178 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68b4975746-rrxsl"] Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.070822 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mshgh\" (UniqueName: \"kubernetes.io/projected/e25e09ba-054e-4825-8e1f-a810ddbc9444-kube-api-access-mshgh\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.070902 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-config\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.070932 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-client-ca\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 
13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071132 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25e09ba-054e-4825-8e1f-a810ddbc9444-serving-cert\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071194 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-proxy-ca-bundles\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071244 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77wfd\" (UniqueName: \"kubernetes.io/projected/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-kube-api-access-77wfd\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071256 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qktz2\" (UniqueName: \"kubernetes.io/projected/b839ced0-e4b0-4389-b77c-4f335d33f40a-kube-api-access-qktz2\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071269 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071282 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 
13:05:29.071292 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071302 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b839ced0-e4b0-4389-b77c-4f335d33f40a-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071311 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b839ced0-e4b0-4389-b77c-4f335d33f40a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071319 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.071329 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23963a4-2bdb-4f75-a1b2-713fdea2d32b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.173056 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25e09ba-054e-4825-8e1f-a810ddbc9444-serving-cert\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.173174 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-proxy-ca-bundles\") pod \"controller-manager-68b4975746-rrxsl\" (UID: 
\"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.173212 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mshgh\" (UniqueName: \"kubernetes.io/projected/e25e09ba-054e-4825-8e1f-a810ddbc9444-kube-api-access-mshgh\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.173306 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-config\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.174023 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-client-ca\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.174970 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-proxy-ca-bundles\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.175097 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-config\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.175132 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-client-ca\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.178993 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25e09ba-054e-4825-8e1f-a810ddbc9444-serving-cert\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.197156 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mshgh\" (UniqueName: \"kubernetes.io/projected/e25e09ba-054e-4825-8e1f-a810ddbc9444-kube-api-access-mshgh\") pod \"controller-manager-68b4975746-rrxsl\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.292307 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.432387 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.432473 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.452173 4912 generic.go:334] "Generic (PLEG): container finished" podID="b839ced0-e4b0-4389-b77c-4f335d33f40a" containerID="9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271" exitCode=0 Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.452337 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.453177 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" event={"ID":"b839ced0-e4b0-4389-b77c-4f335d33f40a","Type":"ContainerDied","Data":"9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271"} Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.453253 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll" event={"ID":"b839ced0-e4b0-4389-b77c-4f335d33f40a","Type":"ContainerDied","Data":"7172da9d29da03f6d46b40bb910357099641d435a4246435f282c3de7c16e8e4"} Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.453275 4912 scope.go:117] "RemoveContainer" containerID="9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.471508 4912 generic.go:334] "Generic (PLEG): container finished" podID="e23963a4-2bdb-4f75-a1b2-713fdea2d32b" 
containerID="7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981" exitCode=0 Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.472296 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.476251 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" event={"ID":"e23963a4-2bdb-4f75-a1b2-713fdea2d32b","Type":"ContainerDied","Data":"7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981"} Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.476367 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6" event={"ID":"e23963a4-2bdb-4f75-a1b2-713fdea2d32b","Type":"ContainerDied","Data":"00a4457ed0826e883e334e0495286728956a6a655fdc72a4d08e987fe4908e54"} Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.476740 4912 scope.go:117] "RemoveContainer" containerID="9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271" Mar 18 13:05:29 crc kubenswrapper[4912]: E0318 13:05:29.479836 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271\": container with ID starting with 9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271 not found: ID does not exist" containerID="9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.479909 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271"} err="failed to get container status \"9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271\": 
rpc error: code = NotFound desc = could not find container \"9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271\": container with ID starting with 9b124e45934fe852e8a40089987c8d7e07fd201d729ce857fbbcae6aa8b08271 not found: ID does not exist" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.479962 4912 scope.go:117] "RemoveContainer" containerID="7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.485823 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll"] Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.488623 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-655d9fbc5f-6k6ll"] Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.497198 4912 scope.go:117] "RemoveContainer" containerID="7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981" Mar 18 13:05:29 crc kubenswrapper[4912]: E0318 13:05:29.497578 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981\": container with ID starting with 7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981 not found: ID does not exist" containerID="7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.497604 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981"} err="failed to get container status \"7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981\": rpc error: code = NotFound desc = could not find container \"7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981\": container with ID starting with 
7c1080b3bfa0ec54137d9a7661c8af127d0dc9a2578b5ac64afea3b55d163981 not found: ID does not exist" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.512981 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6"] Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.516438 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659d7b9f67-x7bj6"] Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.859116 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.860505 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.862972 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.864309 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.873330 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.893166 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-var-lock\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.893242 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.893291 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfc460f0-40a2-4a51-a361-d29032b64e15-kube-api-access\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.994760 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.994842 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfc460f0-40a2-4a51-a361-d29032b64e15-kube-api-access\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.994983 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-var-lock\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.994993 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:29 crc kubenswrapper[4912]: I0318 13:05:29.995145 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-var-lock\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.022012 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfc460f0-40a2-4a51-a361-d29032b64e15-kube-api-access\") pod \"installer-9-crc\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.032150 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68b4975746-rrxsl"] Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.183242 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.236954 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b839ced0-e4b0-4389-b77c-4f335d33f40a" path="/var/lib/kubelet/pods/b839ced0-e4b0-4389-b77c-4f335d33f40a/volumes" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.237695 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23963a4-2bdb-4f75-a1b2-713fdea2d32b" path="/var/lib/kubelet/pods/e23963a4-2bdb-4f75-a1b2-713fdea2d32b/volumes" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.480517 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" event={"ID":"e25e09ba-054e-4825-8e1f-a810ddbc9444","Type":"ContainerStarted","Data":"606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993"} Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.481073 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" event={"ID":"e25e09ba-054e-4825-8e1f-a810ddbc9444","Type":"ContainerStarted","Data":"5978d3ed957359a715aa0204613ba01269a33704d8222573ad60a48c707b6d6e"} Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.481097 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.487918 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.504517 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" podStartSLOduration=4.504486696 podStartE2EDuration="4.504486696s" podCreationTimestamp="2026-03-18 13:05:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:30.500935689 +0000 UTC m=+178.960363134" watchObservedRunningTime="2026-03-18 13:05:30.504486696 +0000 UTC m=+178.963914121" Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.668873 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 13:05:30 crc kubenswrapper[4912]: W0318 13:05:30.681951 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcfc460f0_40a2_4a51_a361_d29032b64e15.slice/crio-42d6ebcd7c46eccfb8d721c761978c9d7648e8116e8ea75b682b159666894f43 WatchSource:0}: Error finding container 42d6ebcd7c46eccfb8d721c761978c9d7648e8116e8ea75b682b159666894f43: Status 404 returned error can't find the container with id 42d6ebcd7c46eccfb8d721c761978c9d7648e8116e8ea75b682b159666894f43 Mar 18 13:05:30 crc kubenswrapper[4912]: I0318 13:05:30.886392 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n2x6x" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="registry-server" probeResult="failure" output=< Mar 18 13:05:30 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:05:30 crc kubenswrapper[4912]: > Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.215935 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv"] Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.216905 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.220133 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.224300 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.224674 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.224911 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.225925 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.226586 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.244320 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv"] Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.318751 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cr4b\" (UniqueName: \"kubernetes.io/projected/00a4e8f4-395b-47b5-b52d-0035c6342061-kube-api-access-5cr4b\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.318829 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-client-ca\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.318866 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-config\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.318902 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a4e8f4-395b-47b5-b52d-0035c6342061-serving-cert\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.420189 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cr4b\" (UniqueName: \"kubernetes.io/projected/00a4e8f4-395b-47b5-b52d-0035c6342061-kube-api-access-5cr4b\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.420244 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-client-ca\") pod 
\"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.420272 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-config\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.420297 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a4e8f4-395b-47b5-b52d-0035c6342061-serving-cert\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.421782 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-client-ca\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.421930 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-config\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.441180 4912 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a4e8f4-395b-47b5-b52d-0035c6342061-serving-cert\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.445892 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cr4b\" (UniqueName: \"kubernetes.io/projected/00a4e8f4-395b-47b5-b52d-0035c6342061-kube-api-access-5cr4b\") pod \"route-controller-manager-69f7d7d65f-8k6vv\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.497512 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cfc460f0-40a2-4a51-a361-d29032b64e15","Type":"ContainerStarted","Data":"5a8694b9c0aabfb5c5182aad4f39857090dddcd964722fff5ad2c6c0265287b6"} Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.497599 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cfc460f0-40a2-4a51-a361-d29032b64e15","Type":"ContainerStarted","Data":"42d6ebcd7c46eccfb8d721c761978c9d7648e8116e8ea75b682b159666894f43"} Mar 18 13:05:31 crc kubenswrapper[4912]: I0318 13:05:31.545944 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:32 crc kubenswrapper[4912]: I0318 13:05:32.039988 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv"] Mar 18 13:05:32 crc kubenswrapper[4912]: W0318 13:05:32.057263 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a4e8f4_395b_47b5_b52d_0035c6342061.slice/crio-a68520c0f504c23e296f989d9c87a17f31102025da89ac5bf880bd19296de345 WatchSource:0}: Error finding container a68520c0f504c23e296f989d9c87a17f31102025da89ac5bf880bd19296de345: Status 404 returned error can't find the container with id a68520c0f504c23e296f989d9c87a17f31102025da89ac5bf880bd19296de345 Mar 18 13:05:32 crc kubenswrapper[4912]: I0318 13:05:32.512938 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" event={"ID":"00a4e8f4-395b-47b5-b52d-0035c6342061","Type":"ContainerStarted","Data":"b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba"} Mar 18 13:05:32 crc kubenswrapper[4912]: I0318 13:05:32.513506 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" event={"ID":"00a4e8f4-395b-47b5-b52d-0035c6342061","Type":"ContainerStarted","Data":"a68520c0f504c23e296f989d9c87a17f31102025da89ac5bf880bd19296de345"} Mar 18 13:05:32 crc kubenswrapper[4912]: I0318 13:05:32.513989 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:32 crc kubenswrapper[4912]: I0318 13:05:32.585925 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.585903026 
podStartE2EDuration="3.585903026s" podCreationTimestamp="2026-03-18 13:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:32.546216456 +0000 UTC m=+181.005643911" watchObservedRunningTime="2026-03-18 13:05:32.585903026 +0000 UTC m=+181.045330451" Mar 18 13:05:32 crc kubenswrapper[4912]: I0318 13:05:32.587271 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" podStartSLOduration=6.587265804 podStartE2EDuration="6.587265804s" podCreationTimestamp="2026-03-18 13:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:32.578877575 +0000 UTC m=+181.038305010" watchObservedRunningTime="2026-03-18 13:05:32.587265804 +0000 UTC m=+181.046693229" Mar 18 13:05:32 crc kubenswrapper[4912]: I0318 13:05:32.632777 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:36 crc kubenswrapper[4912]: I0318 13:05:36.472008 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:05:36 crc kubenswrapper[4912]: I0318 13:05:36.472726 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:05:36 crc kubenswrapper[4912]: I0318 13:05:36.530799 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:05:36 crc kubenswrapper[4912]: I0318 13:05:36.592993 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:05:36 crc kubenswrapper[4912]: I0318 
13:05:36.824806 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:05:36 crc kubenswrapper[4912]: I0318 13:05:36.825345 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:05:36 crc kubenswrapper[4912]: I0318 13:05:36.864759 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:05:37 crc kubenswrapper[4912]: I0318 13:05:37.593516 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:05:38 crc kubenswrapper[4912]: I0318 13:05:38.165752 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4tf6g"] Mar 18 13:05:39 crc kubenswrapper[4912]: I0318 13:05:39.481752 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:05:39 crc kubenswrapper[4912]: I0318 13:05:39.534449 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:05:39 crc kubenswrapper[4912]: I0318 13:05:39.555692 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4tf6g" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="registry-server" containerID="cri-o://908a121d5f7ab1fc6443809189516caf529fdc79bd633bc2ca417358e4083120" gracePeriod=2 Mar 18 13:05:40 crc kubenswrapper[4912]: I0318 13:05:40.562790 4912 generic.go:334] "Generic (PLEG): container finished" podID="3af91020-4095-48f0-9457-b171de576fe0" containerID="2c9f84ac8a9e710c3a8dbd2ffe63ffdcd56bfabeee96dab9956ccc5492389e73" exitCode=0 Mar 18 13:05:40 crc kubenswrapper[4912]: I0318 13:05:40.563158 4912 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-gl7cm" event={"ID":"3af91020-4095-48f0-9457-b171de576fe0","Type":"ContainerDied","Data":"2c9f84ac8a9e710c3a8dbd2ffe63ffdcd56bfabeee96dab9956ccc5492389e73"} Mar 18 13:05:40 crc kubenswrapper[4912]: I0318 13:05:40.569521 4912 generic.go:334] "Generic (PLEG): container finished" podID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerID="908a121d5f7ab1fc6443809189516caf529fdc79bd633bc2ca417358e4083120" exitCode=0 Mar 18 13:05:40 crc kubenswrapper[4912]: I0318 13:05:40.569583 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tf6g" event={"ID":"ba477a7f-ee05-44cf-be52-4c67c7a50192","Type":"ContainerDied","Data":"908a121d5f7ab1fc6443809189516caf529fdc79bd633bc2ca417358e4083120"} Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.154776 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.196541 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-utilities\") pod \"ba477a7f-ee05-44cf-be52-4c67c7a50192\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.196661 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-catalog-content\") pod \"ba477a7f-ee05-44cf-be52-4c67c7a50192\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.196743 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6tf9\" (UniqueName: \"kubernetes.io/projected/ba477a7f-ee05-44cf-be52-4c67c7a50192-kube-api-access-t6tf9\") pod 
\"ba477a7f-ee05-44cf-be52-4c67c7a50192\" (UID: \"ba477a7f-ee05-44cf-be52-4c67c7a50192\") " Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.197700 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-utilities" (OuterVolumeSpecName: "utilities") pod "ba477a7f-ee05-44cf-be52-4c67c7a50192" (UID: "ba477a7f-ee05-44cf-be52-4c67c7a50192"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.204718 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba477a7f-ee05-44cf-be52-4c67c7a50192-kube-api-access-t6tf9" (OuterVolumeSpecName: "kube-api-access-t6tf9") pod "ba477a7f-ee05-44cf-be52-4c67c7a50192" (UID: "ba477a7f-ee05-44cf-be52-4c67c7a50192"). InnerVolumeSpecName "kube-api-access-t6tf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.258758 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba477a7f-ee05-44cf-be52-4c67c7a50192" (UID: "ba477a7f-ee05-44cf-be52-4c67c7a50192"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.298006 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.298054 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba477a7f-ee05-44cf-be52-4c67c7a50192-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.298075 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6tf9\" (UniqueName: \"kubernetes.io/projected/ba477a7f-ee05-44cf-be52-4c67c7a50192-kube-api-access-t6tf9\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.579266 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tf6g" event={"ID":"ba477a7f-ee05-44cf-be52-4c67c7a50192","Type":"ContainerDied","Data":"f1524a86a7771dceff22d226fb2b514baf25051b456a05e49d115530876d5465"} Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.580518 4912 scope.go:117] "RemoveContainer" containerID="908a121d5f7ab1fc6443809189516caf529fdc79bd633bc2ca417358e4083120" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.579353 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4tf6g" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.598678 4912 scope.go:117] "RemoveContainer" containerID="d3dc23f1768beb0da6ad9cff157c22c6801e5e018cfaa8656a5b764ac9c514e1" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.630384 4912 scope.go:117] "RemoveContainer" containerID="40379c5a2d8a39c404ad21e27ef3c63e96a5a8d92f75d1b7c86780fb2c54cccd" Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.631239 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4tf6g"] Mar 18 13:05:41 crc kubenswrapper[4912]: I0318 13:05:41.638592 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4tf6g"] Mar 18 13:05:42 crc kubenswrapper[4912]: I0318 13:05:42.236734 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" path="/var/lib/kubelet/pods/ba477a7f-ee05-44cf-be52-4c67c7a50192/volumes" Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.611747 4912 generic.go:334] "Generic (PLEG): container finished" podID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerID="8121e02888c8a512f9c7f174bbd8b1b6ddd56da564624e9a6c1b2b34eb04703c" exitCode=0 Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.611827 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9dkt" event={"ID":"797e0d01-0e3c-498f-abe9-5c90c0e53215","Type":"ContainerDied","Data":"8121e02888c8a512f9c7f174bbd8b1b6ddd56da564624e9a6c1b2b34eb04703c"} Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.619271 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl7cm" event={"ID":"3af91020-4095-48f0-9457-b171de576fe0","Type":"ContainerStarted","Data":"ced6c6ba4a8f9bebe479004ec598cb528f05cca9587ef5c8e65b1baf64e7f19d"} Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.624493 
4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crcqh" event={"ID":"840ef508-c05b-4b3b-bb16-e15729003be1","Type":"ContainerStarted","Data":"87efa23e646dc6a554baaaa7f2d3e352bc64777379eee855fcd5db87b03429f1"} Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.627825 4912 generic.go:334] "Generic (PLEG): container finished" podID="8c189f26-c791-48a6-a060-9982c8666243" containerID="89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730" exitCode=0 Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.627917 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj4gk" event={"ID":"8c189f26-c791-48a6-a060-9982c8666243","Type":"ContainerDied","Data":"89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730"} Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.630189 4912 generic.go:334] "Generic (PLEG): container finished" podID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerID="2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35" exitCode=0 Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.630317 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65p9d" event={"ID":"5201881b-c2ba-46b7-aeae-62df63a255e8","Type":"ContainerDied","Data":"2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35"} Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.735659 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gl7cm" podStartSLOduration=4.260608291 podStartE2EDuration="1m0.735637352s" podCreationTimestamp="2026-03-18 13:04:46 +0000 UTC" firstStartedPulling="2026-03-18 13:04:49.556277508 +0000 UTC m=+138.015704933" lastFinishedPulling="2026-03-18 13:05:46.031306569 +0000 UTC m=+194.490733994" observedRunningTime="2026-03-18 13:05:46.729476434 +0000 UTC m=+195.188903879" watchObservedRunningTime="2026-03-18 
13:05:46.735637352 +0000 UTC m=+195.195064777" Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.753830 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68b4975746-rrxsl"] Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.754149 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" podUID="e25e09ba-054e-4825-8e1f-a810ddbc9444" containerName="controller-manager" containerID="cri-o://606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993" gracePeriod=30 Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.775550 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv"] Mar 18 13:05:46 crc kubenswrapper[4912]: I0318 13:05:46.775816 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" podUID="00a4e8f4-395b-47b5-b52d-0035c6342061" containerName="route-controller-manager" containerID="cri-o://b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba" gracePeriod=30 Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.013375 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.013763 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.375862 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.487918 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a4e8f4-395b-47b5-b52d-0035c6342061-serving-cert\") pod \"00a4e8f4-395b-47b5-b52d-0035c6342061\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.488003 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-config\") pod \"00a4e8f4-395b-47b5-b52d-0035c6342061\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.488064 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cr4b\" (UniqueName: \"kubernetes.io/projected/00a4e8f4-395b-47b5-b52d-0035c6342061-kube-api-access-5cr4b\") pod \"00a4e8f4-395b-47b5-b52d-0035c6342061\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.488140 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-client-ca\") pod \"00a4e8f4-395b-47b5-b52d-0035c6342061\" (UID: \"00a4e8f4-395b-47b5-b52d-0035c6342061\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.488994 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-client-ca" (OuterVolumeSpecName: "client-ca") pod "00a4e8f4-395b-47b5-b52d-0035c6342061" (UID: "00a4e8f4-395b-47b5-b52d-0035c6342061"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.489378 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-config" (OuterVolumeSpecName: "config") pod "00a4e8f4-395b-47b5-b52d-0035c6342061" (UID: "00a4e8f4-395b-47b5-b52d-0035c6342061"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.496325 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a4e8f4-395b-47b5-b52d-0035c6342061-kube-api-access-5cr4b" (OuterVolumeSpecName: "kube-api-access-5cr4b") pod "00a4e8f4-395b-47b5-b52d-0035c6342061" (UID: "00a4e8f4-395b-47b5-b52d-0035c6342061"). InnerVolumeSpecName "kube-api-access-5cr4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.496411 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a4e8f4-395b-47b5-b52d-0035c6342061-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00a4e8f4-395b-47b5-b52d-0035c6342061" (UID: "00a4e8f4-395b-47b5-b52d-0035c6342061"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.523495 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.589920 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-proxy-ca-bundles\") pod \"e25e09ba-054e-4825-8e1f-a810ddbc9444\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.590002 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25e09ba-054e-4825-8e1f-a810ddbc9444-serving-cert\") pod \"e25e09ba-054e-4825-8e1f-a810ddbc9444\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.590121 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-config\") pod \"e25e09ba-054e-4825-8e1f-a810ddbc9444\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.590164 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mshgh\" (UniqueName: \"kubernetes.io/projected/e25e09ba-054e-4825-8e1f-a810ddbc9444-kube-api-access-mshgh\") pod \"e25e09ba-054e-4825-8e1f-a810ddbc9444\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.590921 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e25e09ba-054e-4825-8e1f-a810ddbc9444" (UID: "e25e09ba-054e-4825-8e1f-a810ddbc9444"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.591088 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-config" (OuterVolumeSpecName: "config") pod "e25e09ba-054e-4825-8e1f-a810ddbc9444" (UID: "e25e09ba-054e-4825-8e1f-a810ddbc9444"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.591404 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-client-ca\") pod \"e25e09ba-054e-4825-8e1f-a810ddbc9444\" (UID: \"e25e09ba-054e-4825-8e1f-a810ddbc9444\") " Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.591661 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-client-ca" (OuterVolumeSpecName: "client-ca") pod "e25e09ba-054e-4825-8e1f-a810ddbc9444" (UID: "e25e09ba-054e-4825-8e1f-a810ddbc9444"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.592066 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a4e8f4-395b-47b5-b52d-0035c6342061-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.592088 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.592099 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.592113 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cr4b\" (UniqueName: \"kubernetes.io/projected/00a4e8f4-395b-47b5-b52d-0035c6342061-kube-api-access-5cr4b\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.592122 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.592136 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a4e8f4-395b-47b5-b52d-0035c6342061-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.592145 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e25e09ba-054e-4825-8e1f-a810ddbc9444-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.598263 4912 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e25e09ba-054e-4825-8e1f-a810ddbc9444-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e25e09ba-054e-4825-8e1f-a810ddbc9444" (UID: "e25e09ba-054e-4825-8e1f-a810ddbc9444"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.598283 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25e09ba-054e-4825-8e1f-a810ddbc9444-kube-api-access-mshgh" (OuterVolumeSpecName: "kube-api-access-mshgh") pod "e25e09ba-054e-4825-8e1f-a810ddbc9444" (UID: "e25e09ba-054e-4825-8e1f-a810ddbc9444"). InnerVolumeSpecName "kube-api-access-mshgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.638005 4912 generic.go:334] "Generic (PLEG): container finished" podID="00a4e8f4-395b-47b5-b52d-0035c6342061" containerID="b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba" exitCode=0 Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.638134 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.638104 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" event={"ID":"00a4e8f4-395b-47b5-b52d-0035c6342061","Type":"ContainerDied","Data":"b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.638734 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv" event={"ID":"00a4e8f4-395b-47b5-b52d-0035c6342061","Type":"ContainerDied","Data":"a68520c0f504c23e296f989d9c87a17f31102025da89ac5bf880bd19296de345"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.639019 4912 scope.go:117] "RemoveContainer" containerID="b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.641534 4912 generic.go:334] "Generic (PLEG): container finished" podID="840ef508-c05b-4b3b-bb16-e15729003be1" containerID="87efa23e646dc6a554baaaa7f2d3e352bc64777379eee855fcd5db87b03429f1" exitCode=0 Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.641631 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crcqh" event={"ID":"840ef508-c05b-4b3b-bb16-e15729003be1","Type":"ContainerDied","Data":"87efa23e646dc6a554baaaa7f2d3e352bc64777379eee855fcd5db87b03429f1"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.672010 4912 scope.go:117] "RemoveContainer" containerID="b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba" Mar 18 13:05:47 crc kubenswrapper[4912]: E0318 13:05:47.673774 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba\": 
container with ID starting with b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba not found: ID does not exist" containerID="b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.673832 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba"} err="failed to get container status \"b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba\": rpc error: code = NotFound desc = could not find container \"b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba\": container with ID starting with b45394025039b995eed4b2ddcfa29c6dcb2d2b0eea21e5830bfadda235a131ba not found: ID does not exist" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.674103 4912 generic.go:334] "Generic (PLEG): container finished" podID="e25e09ba-054e-4825-8e1f-a810ddbc9444" containerID="606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993" exitCode=0 Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.674344 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.674351 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" event={"ID":"e25e09ba-054e-4825-8e1f-a810ddbc9444","Type":"ContainerDied","Data":"606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.674460 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68b4975746-rrxsl" event={"ID":"e25e09ba-054e-4825-8e1f-a810ddbc9444","Type":"ContainerDied","Data":"5978d3ed957359a715aa0204613ba01269a33704d8222573ad60a48c707b6d6e"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.674494 4912 scope.go:117] "RemoveContainer" containerID="606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.690295 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj4gk" event={"ID":"8c189f26-c791-48a6-a060-9982c8666243","Type":"ContainerStarted","Data":"2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.696513 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mshgh\" (UniqueName: \"kubernetes.io/projected/e25e09ba-054e-4825-8e1f-a810ddbc9444-kube-api-access-mshgh\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.696533 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25e09ba-054e-4825-8e1f-a810ddbc9444-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.706009 4912 scope.go:117] "RemoveContainer" containerID="606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993" Mar 
18 13:05:47 crc kubenswrapper[4912]: E0318 13:05:47.706638 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993\": container with ID starting with 606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993 not found: ID does not exist" containerID="606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.706680 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993"} err="failed to get container status \"606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993\": rpc error: code = NotFound desc = could not find container \"606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993\": container with ID starting with 606759e37e5d8ab65d747703b5a7cc9c1ecbfa16b29c2943c3bc4bb3d2ffc993 not found: ID does not exist" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.710065 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65p9d" event={"ID":"5201881b-c2ba-46b7-aeae-62df63a255e8","Type":"ContainerStarted","Data":"71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.712863 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9dkt" event={"ID":"797e0d01-0e3c-498f-abe9-5c90c0e53215","Type":"ContainerStarted","Data":"0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550"} Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.719211 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv"] Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.720807 4912 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f7d7d65f-8k6vv"] Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.739625 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r9dkt" podStartSLOduration=4.203457216 podStartE2EDuration="1m1.739590462s" podCreationTimestamp="2026-03-18 13:04:46 +0000 UTC" firstStartedPulling="2026-03-18 13:04:49.522633232 +0000 UTC m=+137.982060657" lastFinishedPulling="2026-03-18 13:05:47.058766478 +0000 UTC m=+195.518193903" observedRunningTime="2026-03-18 13:05:47.73889895 +0000 UTC m=+196.198326385" watchObservedRunningTime="2026-03-18 13:05:47.739590462 +0000 UTC m=+196.199017897" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.759920 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hj4gk" podStartSLOduration=2.2943317 podStartE2EDuration="59.759897444s" podCreationTimestamp="2026-03-18 13:04:48 +0000 UTC" firstStartedPulling="2026-03-18 13:04:49.648638313 +0000 UTC m=+138.108065738" lastFinishedPulling="2026-03-18 13:05:47.114204057 +0000 UTC m=+195.573631482" observedRunningTime="2026-03-18 13:05:47.756600169 +0000 UTC m=+196.216027614" watchObservedRunningTime="2026-03-18 13:05:47.759897444 +0000 UTC m=+196.219324869" Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.778365 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-65p9d" podStartSLOduration=3.548242694 podStartE2EDuration="59.778341697s" podCreationTimestamp="2026-03-18 13:04:48 +0000 UTC" firstStartedPulling="2026-03-18 13:04:50.78073174 +0000 UTC m=+139.240159165" lastFinishedPulling="2026-03-18 13:05:47.010830743 +0000 UTC m=+195.470258168" observedRunningTime="2026-03-18 13:05:47.777258182 +0000 UTC m=+196.236685607" watchObservedRunningTime="2026-03-18 13:05:47.778341697 +0000 UTC m=+196.237769122" 
Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.793083 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68b4975746-rrxsl"] Mar 18 13:05:47 crc kubenswrapper[4912]: I0318 13:05:47.797945 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-68b4975746-rrxsl"] Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.066031 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gl7cm" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:05:48 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:05:48 crc kubenswrapper[4912]: > Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.236868 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a4e8f4-395b-47b5-b52d-0035c6342061" path="/var/lib/kubelet/pods/00a4e8f4-395b-47b5-b52d-0035c6342061/volumes" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.237828 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e25e09ba-054e-4825-8e1f-a810ddbc9444" path="/var/lib/kubelet/pods/e25e09ba-054e-4825-8e1f-a810ddbc9444/volumes" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238479 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f4d668c6b-bx78m"] Mar 18 13:05:48 crc kubenswrapper[4912]: E0318 13:05:48.238698 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="extract-content" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238716 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="extract-content" Mar 18 13:05:48 crc kubenswrapper[4912]: E0318 13:05:48.238746 4912 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="registry-server" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238754 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="registry-server" Mar 18 13:05:48 crc kubenswrapper[4912]: E0318 13:05:48.238764 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="extract-utilities" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238770 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="extract-utilities" Mar 18 13:05:48 crc kubenswrapper[4912]: E0318 13:05:48.238781 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4e8f4-395b-47b5-b52d-0035c6342061" containerName="route-controller-manager" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238787 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4e8f4-395b-47b5-b52d-0035c6342061" containerName="route-controller-manager" Mar 18 13:05:48 crc kubenswrapper[4912]: E0318 13:05:48.238798 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25e09ba-054e-4825-8e1f-a810ddbc9444" containerName="controller-manager" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238804 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25e09ba-054e-4825-8e1f-a810ddbc9444" containerName="controller-manager" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238919 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e25e09ba-054e-4825-8e1f-a810ddbc9444" containerName="controller-manager" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238935 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a4e8f4-395b-47b5-b52d-0035c6342061" containerName="route-controller-manager" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.238944 4912 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ba477a7f-ee05-44cf-be52-4c67c7a50192" containerName="registry-server" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.239450 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2"] Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.239724 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.240607 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.244117 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2"] Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.244714 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.244736 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.245496 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.245679 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.245862 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.246023 4912 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.246352 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.246468 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.246583 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.250393 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.250725 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.252660 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f4d668c6b-bx78m"] Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.252869 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.255634 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.305922 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26z9s\" (UniqueName: \"kubernetes.io/projected/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-kube-api-access-26z9s\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " 
pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306058 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-client-ca\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306092 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-client-ca\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306117 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-serving-cert\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306142 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-config\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306223 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-config\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306309 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8ss\" (UniqueName: \"kubernetes.io/projected/25454711-58f4-4b39-96fb-2783d2cf6045-kube-api-access-dd8ss\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306338 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-proxy-ca-bundles\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.306382 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25454711-58f4-4b39-96fb-2783d2cf6045-serving-cert\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407297 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25454711-58f4-4b39-96fb-2783d2cf6045-serving-cert\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407351 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26z9s\" (UniqueName: \"kubernetes.io/projected/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-kube-api-access-26z9s\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407398 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-client-ca\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407419 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-client-ca\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407435 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-serving-cert\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407454 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-config\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407483 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-config\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407509 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8ss\" (UniqueName: \"kubernetes.io/projected/25454711-58f4-4b39-96fb-2783d2cf6045-kube-api-access-dd8ss\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.407535 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-proxy-ca-bundles\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.409302 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-client-ca\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 
13:05:48.409317 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-proxy-ca-bundles\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.409372 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-client-ca\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.409653 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-config\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.410148 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-config\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.413999 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-serving-cert\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " 
pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.423484 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25454711-58f4-4b39-96fb-2783d2cf6045-serving-cert\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.431857 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26z9s\" (UniqueName: \"kubernetes.io/projected/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-kube-api-access-26z9s\") pod \"controller-manager-6f4d668c6b-bx78m\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.434905 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8ss\" (UniqueName: \"kubernetes.io/projected/25454711-58f4-4b39-96fb-2783d2cf6045-kube-api-access-dd8ss\") pod \"route-controller-manager-6d96bbbfc9-dgtz2\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.473003 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.473394 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.559686 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.576732 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.723726 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crcqh" event={"ID":"840ef508-c05b-4b3b-bb16-e15729003be1","Type":"ContainerStarted","Data":"3531409c4a4c22761363adc2d82e05b552c2827e0d3b2f46f02c76e308ad471a"} Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.755676 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-crcqh" podStartSLOduration=2.487448899 podStartE2EDuration="59.753028134s" podCreationTimestamp="2026-03-18 13:04:49 +0000 UTC" firstStartedPulling="2026-03-18 13:04:50.821352306 +0000 UTC m=+139.280779731" lastFinishedPulling="2026-03-18 13:05:48.086931541 +0000 UTC m=+196.546358966" observedRunningTime="2026-03-18 13:05:48.751107733 +0000 UTC m=+197.210535178" watchObservedRunningTime="2026-03-18 13:05:48.753028134 +0000 UTC m=+197.212455559" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.778481 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:05:48 crc kubenswrapper[4912]: I0318 13:05:48.779284 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.077920 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f4d668c6b-bx78m"] Mar 18 13:05:49 crc kubenswrapper[4912]: W0318 13:05:49.083739 4912 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd3b680_f303_4dcf_a99d_29f16d46cfdf.slice/crio-5a1fc5995adeec269da68eac6798e261c2c98fbb622f951da65862be5b4240e7 WatchSource:0}: Error finding container 5a1fc5995adeec269da68eac6798e261c2c98fbb622f951da65862be5b4240e7: Status 404 returned error can't find the container with id 5a1fc5995adeec269da68eac6798e261c2c98fbb622f951da65862be5b4240e7 Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.162358 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2"] Mar 18 13:05:49 crc kubenswrapper[4912]: W0318 13:05:49.181966 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25454711_58f4_4b39_96fb_2783d2cf6045.slice/crio-48c059017a0b2f5ed558ff53e3909bc46a721d90c4dac80703bb32fb1a87c5c9 WatchSource:0}: Error finding container 48c059017a0b2f5ed558ff53e3909bc46a721d90c4dac80703bb32fb1a87c5c9: Status 404 returned error can't find the container with id 48c059017a0b2f5ed558ff53e3909bc46a721d90c4dac80703bb32fb1a87c5c9 Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.520569 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-65p9d" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="registry-server" probeResult="failure" output=< Mar 18 13:05:49 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:05:49 crc kubenswrapper[4912]: > Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.739922 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" event={"ID":"6dd3b680-f303-4dcf-a99d-29f16d46cfdf","Type":"ContainerStarted","Data":"498b437f0e25f13d2569efdd49c1da3747602c51b4767846e0cb614ed3af823b"} Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.740492 4912 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" event={"ID":"6dd3b680-f303-4dcf-a99d-29f16d46cfdf","Type":"ContainerStarted","Data":"5a1fc5995adeec269da68eac6798e261c2c98fbb622f951da65862be5b4240e7"} Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.740520 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.743985 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" event={"ID":"25454711-58f4-4b39-96fb-2783d2cf6045","Type":"ContainerStarted","Data":"6eb827697350a19236e0ba745d1a6e915547a8ca608e8c9cad7f6e100fbeeb40"} Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.744102 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.744124 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" event={"ID":"25454711-58f4-4b39-96fb-2783d2cf6045","Type":"ContainerStarted","Data":"48c059017a0b2f5ed558ff53e3909bc46a721d90c4dac80703bb32fb1a87c5c9"} Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.749771 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.759958 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" podStartSLOduration=3.75993551 podStartE2EDuration="3.75993551s" podCreationTimestamp="2026-03-18 13:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:49.756486851 +0000 UTC m=+198.215914296" watchObservedRunningTime="2026-03-18 13:05:49.75993551 +0000 UTC m=+198.219362935" Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.776345 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" podStartSLOduration=3.776319408 podStartE2EDuration="3.776319408s" podCreationTimestamp="2026-03-18 13:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:05:49.773462268 +0000 UTC m=+198.232889713" watchObservedRunningTime="2026-03-18 13:05:49.776319408 +0000 UTC m=+198.235746833" Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.828211 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-hj4gk" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="registry-server" probeResult="failure" output=< Mar 18 13:05:49 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:05:49 crc kubenswrapper[4912]: > Mar 18 13:05:49 crc kubenswrapper[4912]: I0318 13:05:49.853494 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:05:50 crc kubenswrapper[4912]: I0318 13:05:50.068642 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:05:50 crc kubenswrapper[4912]: I0318 13:05:50.069665 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:05:50 crc kubenswrapper[4912]: I0318 13:05:50.664794 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" 
podUID="65b97390-afd1-41da-9b38-f3467a213007" containerName="oauth-openshift" containerID="cri-o://07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44" gracePeriod=15 Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.126994 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-crcqh" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="registry-server" probeResult="failure" output=< Mar 18 13:05:51 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:05:51 crc kubenswrapper[4912]: > Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.134267 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.262796 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-audit-policies\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.262916 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-ocp-branding-template\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.262943 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-idp-0-file-data\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 
13:05:51.262984 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-cliconfig\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.263013 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-session\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.263084 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-router-certs\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.263158 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-provider-selection\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.263844 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.264081 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.264199 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-service-ca\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.264282 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-login\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.264313 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-trusted-ca-bundle\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.264339 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod 
"65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.264851 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.266086 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65b97390-afd1-41da-9b38-f3467a213007-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.266228 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65b97390-afd1-41da-9b38-f3467a213007-audit-dir\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.266353 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-serving-cert\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.266391 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb2sk\" (UniqueName: \"kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.266436 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-error\") pod \"65b97390-afd1-41da-9b38-f3467a213007\" (UID: \"65b97390-afd1-41da-9b38-f3467a213007\") " Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.267674 4912 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.267702 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.267720 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.267736 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.267751 4912 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65b97390-afd1-41da-9b38-f3467a213007-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.272406 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.273775 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.273949 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.274582 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.275380 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk" (OuterVolumeSpecName: "kube-api-access-fb2sk") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "kube-api-access-fb2sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.275527 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.276130 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.276246 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.276534 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "65b97390-afd1-41da-9b38-f3467a213007" (UID: "65b97390-afd1-41da-9b38-f3467a213007"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.369859 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.370088 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.371364 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.371389 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb2sk\" (UniqueName: \"kubernetes.io/projected/65b97390-afd1-41da-9b38-f3467a213007-kube-api-access-fb2sk\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.371469 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.371489 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.371503 4912 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.371515 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.371526 4912 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65b97390-afd1-41da-9b38-f3467a213007-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.757142 4912 generic.go:334] "Generic (PLEG): container finished" podID="65b97390-afd1-41da-9b38-f3467a213007" containerID="07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44" exitCode=0 Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.757238 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.757756 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" event={"ID":"65b97390-afd1-41da-9b38-f3467a213007","Type":"ContainerDied","Data":"07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44"} Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.757876 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8fqp" event={"ID":"65b97390-afd1-41da-9b38-f3467a213007","Type":"ContainerDied","Data":"41282a9362f819be4f2434ed2c1e53e2535e8b60b870a607f3007f4832c8a4d5"} Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.757915 4912 scope.go:117] "RemoveContainer" containerID="07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.780524 4912 scope.go:117] "RemoveContainer" containerID="07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44" Mar 18 13:05:51 crc kubenswrapper[4912]: E0318 13:05:51.781157 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44\": container with ID starting with 07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44 not found: ID does not exist" containerID="07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.781236 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44"} err="failed to get container status \"07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44\": rpc error: code = NotFound desc = could not find container 
\"07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44\": container with ID starting with 07be49f9c1940a4431f4954d8e797c5bc61452d3af7fcbc4ae992eb4f689ce44 not found: ID does not exist" Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.797369 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8fqp"] Mar 18 13:05:51 crc kubenswrapper[4912]: I0318 13:05:51.802348 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8fqp"] Mar 18 13:05:52 crc kubenswrapper[4912]: I0318 13:05:52.239755 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b97390-afd1-41da-9b38-f3467a213007" path="/var/lib/kubelet/pods/65b97390-afd1-41da-9b38-f3467a213007/volumes" Mar 18 13:05:56 crc kubenswrapper[4912]: I0318 13:05:56.031616 4912 ???:1] "http: TLS handshake error from 192.168.126.11:54276: no serving certificate available for the kubelet" Mar 18 13:05:56 crc kubenswrapper[4912]: I0318 13:05:56.673879 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:05:56 crc kubenswrapper[4912]: I0318 13:05:56.675597 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:05:56 crc kubenswrapper[4912]: I0318 13:05:56.714372 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:05:56 crc kubenswrapper[4912]: I0318 13:05:56.839919 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:05:57 crc kubenswrapper[4912]: I0318 13:05:57.063240 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:05:57 crc kubenswrapper[4912]: I0318 
13:05:57.104933 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:05:57 crc kubenswrapper[4912]: I0318 13:05:57.965391 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gl7cm"] Mar 18 13:05:58 crc kubenswrapper[4912]: I0318 13:05:58.525018 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:05:58 crc kubenswrapper[4912]: I0318 13:05:58.570771 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:05:58 crc kubenswrapper[4912]: I0318 13:05:58.809741 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gl7cm" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="registry-server" containerID="cri-o://ced6c6ba4a8f9bebe479004ec598cb528f05cca9587ef5c8e65b1baf64e7f19d" gracePeriod=2 Mar 18 13:05:58 crc kubenswrapper[4912]: I0318 13:05:58.841961 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:05:58 crc kubenswrapper[4912]: I0318 13:05:58.891865 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.237685 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5f78599457-wr7bj"] Mar 18 13:05:59 crc kubenswrapper[4912]: E0318 13:05:59.238474 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b97390-afd1-41da-9b38-f3467a213007" containerName="oauth-openshift" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.238498 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b97390-afd1-41da-9b38-f3467a213007" 
containerName="oauth-openshift" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.238614 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b97390-afd1-41da-9b38-f3467a213007" containerName="oauth-openshift" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.239307 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.243718 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.243887 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.244443 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.244688 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.244695 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.244956 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.245416 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.245456 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 
13:05:59.245527 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.245605 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.246000 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.246571 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.252873 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.257925 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.262407 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f78599457-wr7bj"] Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.263741 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293087 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 
13:05:59.293171 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-session\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293521 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-login\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293620 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293662 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xcn9\" (UniqueName: \"kubernetes.io/projected/7aae5da4-fdd1-4295-bfed-a10638501acf-kube-api-access-7xcn9\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293693 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293781 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aae5da4-fdd1-4295-bfed-a10638501acf-audit-dir\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293807 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-audit-policies\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293867 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293894 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: 
\"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.293930 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.294089 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.294130 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.294161 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-error\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: 
I0318 13:05:59.395772 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aae5da4-fdd1-4295-bfed-a10638501acf-audit-dir\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.395835 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-audit-policies\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.395870 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.395901 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.395937 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.395968 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aae5da4-fdd1-4295-bfed-a10638501acf-audit-dir\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.395985 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396106 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396167 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-error\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396334 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396377 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-session\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396407 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-login\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396432 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396459 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xcn9\" (UniqueName: \"kubernetes.io/projected/7aae5da4-fdd1-4295-bfed-a10638501acf-kube-api-access-7xcn9\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: 
\"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.396483 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.397833 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-audit-policies\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.399143 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.399184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.399677 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.404090 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-session\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.404174 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.404341 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-error\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.404361 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " 
pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.405493 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-login\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.409314 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.409573 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.410677 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aae5da4-fdd1-4295-bfed-a10638501acf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.414351 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xcn9\" 
(UniqueName: \"kubernetes.io/projected/7aae5da4-fdd1-4295-bfed-a10638501acf-kube-api-access-7xcn9\") pod \"oauth-openshift-5f78599457-wr7bj\" (UID: \"7aae5da4-fdd1-4295-bfed-a10638501acf\") " pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.559505 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.821344 4912 generic.go:334] "Generic (PLEG): container finished" podID="3af91020-4095-48f0-9457-b171de576fe0" containerID="ced6c6ba4a8f9bebe479004ec598cb528f05cca9587ef5c8e65b1baf64e7f19d" exitCode=0 Mar 18 13:05:59 crc kubenswrapper[4912]: I0318 13:05:59.821479 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl7cm" event={"ID":"3af91020-4095-48f0-9457-b171de576fe0","Type":"ContainerDied","Data":"ced6c6ba4a8f9bebe479004ec598cb528f05cca9587ef5c8e65b1baf64e7f19d"} Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.017945 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f78599457-wr7bj"] Mar 18 13:06:00 crc kubenswrapper[4912]: W0318 13:06:00.025267 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aae5da4_fdd1_4295_bfed_a10638501acf.slice/crio-6886622fb351a6226230222d02dd4ef2f3fe82750b6e14261e160297b465ed58 WatchSource:0}: Error finding container 6886622fb351a6226230222d02dd4ef2f3fe82750b6e14261e160297b465ed58: Status 404 returned error can't find the container with id 6886622fb351a6226230222d02dd4ef2f3fe82750b6e14261e160297b465ed58 Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.032634 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.104769 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-utilities\") pod \"3af91020-4095-48f0-9457-b171de576fe0\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.104986 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-catalog-content\") pod \"3af91020-4095-48f0-9457-b171de576fe0\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.105030 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7rtr\" (UniqueName: \"kubernetes.io/projected/3af91020-4095-48f0-9457-b171de576fe0-kube-api-access-q7rtr\") pod \"3af91020-4095-48f0-9457-b171de576fe0\" (UID: \"3af91020-4095-48f0-9457-b171de576fe0\") " Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.105844 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-utilities" (OuterVolumeSpecName: "utilities") pod "3af91020-4095-48f0-9457-b171de576fe0" (UID: "3af91020-4095-48f0-9457-b171de576fe0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.111180 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af91020-4095-48f0-9457-b171de576fe0-kube-api-access-q7rtr" (OuterVolumeSpecName: "kube-api-access-q7rtr") pod "3af91020-4095-48f0-9457-b171de576fe0" (UID: "3af91020-4095-48f0-9457-b171de576fe0"). InnerVolumeSpecName "kube-api-access-q7rtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.115649 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.163493 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563986-rl8cg"] Mar 18 13:06:00 crc kubenswrapper[4912]: E0318 13:06:00.164369 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="extract-content" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.164391 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="extract-content" Mar 18 13:06:00 crc kubenswrapper[4912]: E0318 13:06:00.164413 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="registry-server" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.164422 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="registry-server" Mar 18 13:06:00 crc kubenswrapper[4912]: E0318 13:06:00.164441 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="extract-utilities" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.164452 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="extract-utilities" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.164593 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af91020-4095-48f0-9457-b171de576fe0" containerName="registry-server" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.165161 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.169433 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.169839 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.169433 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.172466 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-rl8cg"] Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.179572 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3af91020-4095-48f0-9457-b171de576fe0" (UID: "3af91020-4095-48f0-9457-b171de576fe0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.183564 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.206603 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.207121 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7rtr\" (UniqueName: \"kubernetes.io/projected/3af91020-4095-48f0-9457-b171de576fe0-kube-api-access-q7rtr\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.207210 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3af91020-4095-48f0-9457-b171de576fe0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.308780 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktft\" (UniqueName: \"kubernetes.io/projected/b2cbf235-6dbe-4747-b167-89f2593c2ee9-kube-api-access-9ktft\") pod \"auto-csr-approver-29563986-rl8cg\" (UID: \"b2cbf235-6dbe-4747-b167-89f2593c2ee9\") " pod="openshift-infra/auto-csr-approver-29563986-rl8cg" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.410523 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ktft\" (UniqueName: \"kubernetes.io/projected/b2cbf235-6dbe-4747-b167-89f2593c2ee9-kube-api-access-9ktft\") pod \"auto-csr-approver-29563986-rl8cg\" (UID: \"b2cbf235-6dbe-4747-b167-89f2593c2ee9\") " pod="openshift-infra/auto-csr-approver-29563986-rl8cg" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.432417 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ktft\" (UniqueName: \"kubernetes.io/projected/b2cbf235-6dbe-4747-b167-89f2593c2ee9-kube-api-access-9ktft\") pod \"auto-csr-approver-29563986-rl8cg\" (UID: \"b2cbf235-6dbe-4747-b167-89f2593c2ee9\") " pod="openshift-infra/auto-csr-approver-29563986-rl8cg" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.481257 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.757888 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj4gk"] Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.832151 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl7cm" event={"ID":"3af91020-4095-48f0-9457-b171de576fe0","Type":"ContainerDied","Data":"5e8b12566b2e75153fd64dea4c3174a65ba93eceddf2d2a56a1a893f5e1ab055"} Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.832218 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gl7cm" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.832239 4912 scope.go:117] "RemoveContainer" containerID="ced6c6ba4a8f9bebe479004ec598cb528f05cca9587ef5c8e65b1baf64e7f19d" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.837597 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" event={"ID":"7aae5da4-fdd1-4295-bfed-a10638501acf","Type":"ContainerStarted","Data":"2d3916fa1a525ce458866b8ee169fe7859782f6da0b61c3d51454e00f476b357"} Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.837656 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" event={"ID":"7aae5da4-fdd1-4295-bfed-a10638501acf","Type":"ContainerStarted","Data":"6886622fb351a6226230222d02dd4ef2f3fe82750b6e14261e160297b465ed58"} Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.837812 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hj4gk" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="registry-server" containerID="cri-o://2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b" gracePeriod=2 Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.839523 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.847601 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.856516 4912 scope.go:117] "RemoveContainer" containerID="2c9f84ac8a9e710c3a8dbd2ffe63ffdcd56bfabeee96dab9956ccc5492389e73" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.873333 4912 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podStartSLOduration=35.873303409 podStartE2EDuration="35.873303409s" podCreationTimestamp="2026-03-18 13:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:06:00.861418163 +0000 UTC m=+209.320845608" watchObservedRunningTime="2026-03-18 13:06:00.873303409 +0000 UTC m=+209.332730834" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.882719 4912 scope.go:117] "RemoveContainer" containerID="9d57efb31fecfd9de22088df406050103e36b7847eaffe89e0da353ee20f21d8" Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.914103 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gl7cm"] Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.927841 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gl7cm"] Mar 18 13:06:00 crc kubenswrapper[4912]: I0318 13:06:00.936595 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-rl8cg"] Mar 18 13:06:00 crc kubenswrapper[4912]: W0318 13:06:00.983616 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2cbf235_6dbe_4747_b167_89f2593c2ee9.slice/crio-05224b97d5962b68efbbc162c696579edf77719ab3621337d0833a5c73bcc2dd WatchSource:0}: Error finding container 05224b97d5962b68efbbc162c696579edf77719ab3621337d0833a5c73bcc2dd: Status 404 returned error can't find the container with id 05224b97d5962b68efbbc162c696579edf77719ab3621337d0833a5c73bcc2dd Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.447581 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.526732 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d58k4\" (UniqueName: \"kubernetes.io/projected/8c189f26-c791-48a6-a060-9982c8666243-kube-api-access-d58k4\") pod \"8c189f26-c791-48a6-a060-9982c8666243\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.526884 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-utilities\") pod \"8c189f26-c791-48a6-a060-9982c8666243\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.526935 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-catalog-content\") pod \"8c189f26-c791-48a6-a060-9982c8666243\" (UID: \"8c189f26-c791-48a6-a060-9982c8666243\") " Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.529353 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-utilities" (OuterVolumeSpecName: "utilities") pod "8c189f26-c791-48a6-a060-9982c8666243" (UID: "8c189f26-c791-48a6-a060-9982c8666243"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.536248 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c189f26-c791-48a6-a060-9982c8666243-kube-api-access-d58k4" (OuterVolumeSpecName: "kube-api-access-d58k4") pod "8c189f26-c791-48a6-a060-9982c8666243" (UID: "8c189f26-c791-48a6-a060-9982c8666243"). InnerVolumeSpecName "kube-api-access-d58k4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.555986 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c189f26-c791-48a6-a060-9982c8666243" (UID: "8c189f26-c791-48a6-a060-9982c8666243"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.628516 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d58k4\" (UniqueName: \"kubernetes.io/projected/8c189f26-c791-48a6-a060-9982c8666243-kube-api-access-d58k4\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.629002 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.629197 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c189f26-c791-48a6-a060-9982c8666243-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.845325 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" event={"ID":"b2cbf235-6dbe-4747-b167-89f2593c2ee9","Type":"ContainerStarted","Data":"05224b97d5962b68efbbc162c696579edf77719ab3621337d0833a5c73bcc2dd"} Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.849817 4912 generic.go:334] "Generic (PLEG): container finished" podID="8c189f26-c791-48a6-a060-9982c8666243" containerID="2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b" exitCode=0 Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.849893 4912 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj4gk" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.849913 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj4gk" event={"ID":"8c189f26-c791-48a6-a060-9982c8666243","Type":"ContainerDied","Data":"2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b"} Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.849989 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj4gk" event={"ID":"8c189f26-c791-48a6-a060-9982c8666243","Type":"ContainerDied","Data":"f98f62995c2fdc8fb22e64a3c977fb8148e9252a6634165d0e551d53c2577dc5"} Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.850019 4912 scope.go:117] "RemoveContainer" containerID="2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.875335 4912 scope.go:117] "RemoveContainer" containerID="89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.883695 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj4gk"] Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.887147 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj4gk"] Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.897620 4912 scope.go:117] "RemoveContainer" containerID="7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.919423 4912 scope.go:117] "RemoveContainer" containerID="2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b" Mar 18 13:06:01 crc kubenswrapper[4912]: E0318 13:06:01.920172 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b\": container with ID starting with 2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b not found: ID does not exist" containerID="2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.920246 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b"} err="failed to get container status \"2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b\": rpc error: code = NotFound desc = could not find container \"2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b\": container with ID starting with 2843de57a53101800f7c738bded8db3a6b4db56bc38ad4df6eaf587ce12efa0b not found: ID does not exist" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.920282 4912 scope.go:117] "RemoveContainer" containerID="89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730" Mar 18 13:06:01 crc kubenswrapper[4912]: E0318 13:06:01.920600 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730\": container with ID starting with 89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730 not found: ID does not exist" containerID="89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.920633 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730"} err="failed to get container status \"89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730\": rpc error: code = NotFound desc = could not find container \"89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730\": container with ID 
starting with 89ad304d9989a4298a51b962e648aa29152e92318cad06551db13c2bcbd2c730 not found: ID does not exist" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.920653 4912 scope.go:117] "RemoveContainer" containerID="7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc" Mar 18 13:06:01 crc kubenswrapper[4912]: E0318 13:06:01.920964 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc\": container with ID starting with 7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc not found: ID does not exist" containerID="7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc" Mar 18 13:06:01 crc kubenswrapper[4912]: I0318 13:06:01.921002 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc"} err="failed to get container status \"7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc\": rpc error: code = NotFound desc = could not find container \"7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc\": container with ID starting with 7faef420183e41246450e0562ffc7ea44188136beebdfcc8c1377e7823d163dc not found: ID does not exist" Mar 18 13:06:02 crc kubenswrapper[4912]: I0318 13:06:02.237599 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af91020-4095-48f0-9457-b171de576fe0" path="/var/lib/kubelet/pods/3af91020-4095-48f0-9457-b171de576fe0/volumes" Mar 18 13:06:02 crc kubenswrapper[4912]: I0318 13:06:02.238329 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c189f26-c791-48a6-a060-9982c8666243" path="/var/lib/kubelet/pods/8c189f26-c791-48a6-a060-9982c8666243/volumes" Mar 18 13:06:03 crc kubenswrapper[4912]: I0318 13:06:03.159668 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-crcqh"] Mar 18 13:06:03 crc kubenswrapper[4912]: I0318 13:06:03.160549 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-crcqh" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="registry-server" containerID="cri-o://3531409c4a4c22761363adc2d82e05b552c2827e0d3b2f46f02c76e308ad471a" gracePeriod=2 Mar 18 13:06:03 crc kubenswrapper[4912]: I0318 13:06:03.868105 4912 generic.go:334] "Generic (PLEG): container finished" podID="840ef508-c05b-4b3b-bb16-e15729003be1" containerID="3531409c4a4c22761363adc2d82e05b552c2827e0d3b2f46f02c76e308ad471a" exitCode=0 Mar 18 13:06:03 crc kubenswrapper[4912]: I0318 13:06:03.868167 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crcqh" event={"ID":"840ef508-c05b-4b3b-bb16-e15729003be1","Type":"ContainerDied","Data":"3531409c4a4c22761363adc2d82e05b552c2827e0d3b2f46f02c76e308ad471a"} Mar 18 13:06:06 crc kubenswrapper[4912]: I0318 13:06:06.735587 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f4d668c6b-bx78m"] Mar 18 13:06:06 crc kubenswrapper[4912]: I0318 13:06:06.736433 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" podUID="6dd3b680-f303-4dcf-a99d-29f16d46cfdf" containerName="controller-manager" containerID="cri-o://498b437f0e25f13d2569efdd49c1da3747602c51b4767846e0cb614ed3af823b" gracePeriod=30 Mar 18 13:06:06 crc kubenswrapper[4912]: I0318 13:06:06.852083 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2"] Mar 18 13:06:06 crc kubenswrapper[4912]: I0318 13:06:06.852334 4912 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" podUID="25454711-58f4-4b39-96fb-2783d2cf6045" containerName="route-controller-manager" containerID="cri-o://6eb827697350a19236e0ba745d1a6e915547a8ca608e8c9cad7f6e100fbeeb40" gracePeriod=30 Mar 18 13:06:06 crc kubenswrapper[4912]: I0318 13:06:06.890519 4912 generic.go:334] "Generic (PLEG): container finished" podID="6dd3b680-f303-4dcf-a99d-29f16d46cfdf" containerID="498b437f0e25f13d2569efdd49c1da3747602c51b4767846e0cb614ed3af823b" exitCode=0 Mar 18 13:06:06 crc kubenswrapper[4912]: I0318 13:06:06.890585 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" event={"ID":"6dd3b680-f303-4dcf-a99d-29f16d46cfdf","Type":"ContainerDied","Data":"498b437f0e25f13d2569efdd49c1da3747602c51b4767846e0cb614ed3af823b"} Mar 18 13:06:07 crc kubenswrapper[4912]: I0318 13:06:07.899451 4912 generic.go:334] "Generic (PLEG): container finished" podID="25454711-58f4-4b39-96fb-2783d2cf6045" containerID="6eb827697350a19236e0ba745d1a6e915547a8ca608e8c9cad7f6e100fbeeb40" exitCode=0 Mar 18 13:06:07 crc kubenswrapper[4912]: I0318 13:06:07.899519 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" event={"ID":"25454711-58f4-4b39-96fb-2783d2cf6045","Type":"ContainerDied","Data":"6eb827697350a19236e0ba745d1a6e915547a8ca608e8c9cad7f6e100fbeeb40"} Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.226735 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.337085 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-utilities\") pod \"840ef508-c05b-4b3b-bb16-e15729003be1\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.337165 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-catalog-content\") pod \"840ef508-c05b-4b3b-bb16-e15729003be1\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.337196 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5qbg\" (UniqueName: \"kubernetes.io/projected/840ef508-c05b-4b3b-bb16-e15729003be1-kube-api-access-k5qbg\") pod \"840ef508-c05b-4b3b-bb16-e15729003be1\" (UID: \"840ef508-c05b-4b3b-bb16-e15729003be1\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.338893 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-utilities" (OuterVolumeSpecName: "utilities") pod "840ef508-c05b-4b3b-bb16-e15729003be1" (UID: "840ef508-c05b-4b3b-bb16-e15729003be1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.348071 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840ef508-c05b-4b3b-bb16-e15729003be1-kube-api-access-k5qbg" (OuterVolumeSpecName: "kube-api-access-k5qbg") pod "840ef508-c05b-4b3b-bb16-e15729003be1" (UID: "840ef508-c05b-4b3b-bb16-e15729003be1"). InnerVolumeSpecName "kube-api-access-k5qbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.439179 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.439210 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5qbg\" (UniqueName: \"kubernetes.io/projected/840ef508-c05b-4b3b-bb16-e15729003be1-kube-api-access-k5qbg\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.493632 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "840ef508-c05b-4b3b-bb16-e15729003be1" (UID: "840ef508-c05b-4b3b-bb16-e15729003be1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.541058 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840ef508-c05b-4b3b-bb16-e15729003be1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.649518 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.658446 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.745835 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26z9s\" (UniqueName: \"kubernetes.io/projected/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-kube-api-access-26z9s\") pod \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.745912 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-client-ca\") pod \"25454711-58f4-4b39-96fb-2783d2cf6045\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.745950 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25454711-58f4-4b39-96fb-2783d2cf6045-serving-cert\") pod \"25454711-58f4-4b39-96fb-2783d2cf6045\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.745980 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-proxy-ca-bundles\") pod \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.746003 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-config\") pod \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.746308 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dd8ss\" (UniqueName: \"kubernetes.io/projected/25454711-58f4-4b39-96fb-2783d2cf6045-kube-api-access-dd8ss\") pod \"25454711-58f4-4b39-96fb-2783d2cf6045\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.746412 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-serving-cert\") pod \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.746487 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-config\") pod \"25454711-58f4-4b39-96fb-2783d2cf6045\" (UID: \"25454711-58f4-4b39-96fb-2783d2cf6045\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.746601 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-client-ca\") pod \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\" (UID: \"6dd3b680-f303-4dcf-a99d-29f16d46cfdf\") " Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.747182 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6dd3b680-f303-4dcf-a99d-29f16d46cfdf" (UID: "6dd3b680-f303-4dcf-a99d-29f16d46cfdf"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.747212 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-config" (OuterVolumeSpecName: "config") pod "6dd3b680-f303-4dcf-a99d-29f16d46cfdf" (UID: "6dd3b680-f303-4dcf-a99d-29f16d46cfdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.747205 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-client-ca" (OuterVolumeSpecName: "client-ca") pod "25454711-58f4-4b39-96fb-2783d2cf6045" (UID: "25454711-58f4-4b39-96fb-2783d2cf6045"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.747600 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-config" (OuterVolumeSpecName: "config") pod "25454711-58f4-4b39-96fb-2783d2cf6045" (UID: "25454711-58f4-4b39-96fb-2783d2cf6045"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.747786 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-client-ca" (OuterVolumeSpecName: "client-ca") pod "6dd3b680-f303-4dcf-a99d-29f16d46cfdf" (UID: "6dd3b680-f303-4dcf-a99d-29f16d46cfdf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.747993 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.748019 4912 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.748031 4912 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.748068 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.748080 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25454711-58f4-4b39-96fb-2783d2cf6045-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.751503 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25454711-58f4-4b39-96fb-2783d2cf6045-kube-api-access-dd8ss" (OuterVolumeSpecName: "kube-api-access-dd8ss") pod "25454711-58f4-4b39-96fb-2783d2cf6045" (UID: "25454711-58f4-4b39-96fb-2783d2cf6045"). InnerVolumeSpecName "kube-api-access-dd8ss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.751926 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-kube-api-access-26z9s" (OuterVolumeSpecName: "kube-api-access-26z9s") pod "6dd3b680-f303-4dcf-a99d-29f16d46cfdf" (UID: "6dd3b680-f303-4dcf-a99d-29f16d46cfdf"). InnerVolumeSpecName "kube-api-access-26z9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.753246 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6dd3b680-f303-4dcf-a99d-29f16d46cfdf" (UID: "6dd3b680-f303-4dcf-a99d-29f16d46cfdf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.756255 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25454711-58f4-4b39-96fb-2783d2cf6045-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25454711-58f4-4b39-96fb-2783d2cf6045" (UID: "25454711-58f4-4b39-96fb-2783d2cf6045"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.849227 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.849281 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26z9s\" (UniqueName: \"kubernetes.io/projected/6dd3b680-f303-4dcf-a99d-29f16d46cfdf-kube-api-access-26z9s\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.849295 4912 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25454711-58f4-4b39-96fb-2783d2cf6045-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.849304 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd8ss\" (UniqueName: \"kubernetes.io/projected/25454711-58f4-4b39-96fb-2783d2cf6045-kube-api-access-dd8ss\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.907848 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" event={"ID":"b2cbf235-6dbe-4747-b167-89f2593c2ee9","Type":"ContainerStarted","Data":"56df29823074927f3beabd9ee4c3df2aa40497f476a8b01aa79403e9aacb998e"} Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.909853 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.909846 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" event={"ID":"25454711-58f4-4b39-96fb-2783d2cf6045","Type":"ContainerDied","Data":"48c059017a0b2f5ed558ff53e3909bc46a721d90c4dac80703bb32fb1a87c5c9"} Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.910004 4912 scope.go:117] "RemoveContainer" containerID="6eb827697350a19236e0ba745d1a6e915547a8ca608e8c9cad7f6e100fbeeb40" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.911793 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.912286 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" event={"ID":"6dd3b680-f303-4dcf-a99d-29f16d46cfdf","Type":"ContainerDied","Data":"5a1fc5995adeec269da68eac6798e261c2c98fbb622f951da65862be5b4240e7"} Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.914847 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crcqh" event={"ID":"840ef508-c05b-4b3b-bb16-e15729003be1","Type":"ContainerDied","Data":"53a468572fffa5642cc7febdcc839514ca0b215e4624e8562c56463540caf78c"} Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.914932 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crcqh" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.924240 4912 scope.go:117] "RemoveContainer" containerID="498b437f0e25f13d2569efdd49c1da3747602c51b4767846e0cb614ed3af823b" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.936873 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" podStartSLOduration=1.651187006 podStartE2EDuration="8.936850951s" podCreationTimestamp="2026-03-18 13:06:00 +0000 UTC" firstStartedPulling="2026-03-18 13:06:00.985935029 +0000 UTC m=+209.445362454" lastFinishedPulling="2026-03-18 13:06:08.271598974 +0000 UTC m=+216.731026399" observedRunningTime="2026-03-18 13:06:08.935373984 +0000 UTC m=+217.394801419" watchObservedRunningTime="2026-03-18 13:06:08.936850951 +0000 UTC m=+217.396278376" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.944684 4912 scope.go:117] "RemoveContainer" containerID="3531409c4a4c22761363adc2d82e05b552c2827e0d3b2f46f02c76e308ad471a" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.960686 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2"] Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.963709 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2"] Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.970206 4912 scope.go:117] "RemoveContainer" containerID="87efa23e646dc6a554baaaa7f2d3e352bc64777379eee855fcd5db87b03429f1" Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.971669 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crcqh"] Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.974504 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-crcqh"] Mar 18 
13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.987181 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f4d668c6b-bx78m"] Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.990197 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f4d668c6b-bx78m"] Mar 18 13:06:08 crc kubenswrapper[4912]: I0318 13:06:08.992344 4912 scope.go:117] "RemoveContainer" containerID="b2f1258846ac7320a95e3ab94ca1e29e0c5ff644720cb146d4aec202fbd8c070" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.100673 4912 csr.go:261] certificate signing request csr-2w8sh is approved, waiting to be issued Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.107824 4912 csr.go:257] certificate signing request csr-2w8sh is issued Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158066 4912 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158348 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="registry-server" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158370 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="registry-server" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158384 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="extract-content" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158390 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="extract-content" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158398 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" 
containerName="extract-utilities" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158404 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="extract-utilities" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158414 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="registry-server" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158419 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="registry-server" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158428 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd3b680-f303-4dcf-a99d-29f16d46cfdf" containerName="controller-manager" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158434 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd3b680-f303-4dcf-a99d-29f16d46cfdf" containerName="controller-manager" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158443 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25454711-58f4-4b39-96fb-2783d2cf6045" containerName="route-controller-manager" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158449 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="25454711-58f4-4b39-96fb-2783d2cf6045" containerName="route-controller-manager" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158459 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="extract-utilities" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158467 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="extract-utilities" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.158480 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="extract-content" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158486 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="extract-content" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158589 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c189f26-c791-48a6-a060-9982c8666243" containerName="registry-server" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158603 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" containerName="registry-server" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158614 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd3b680-f303-4dcf-a99d-29f16d46cfdf" containerName="controller-manager" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158619 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="25454711-58f4-4b39-96fb-2783d2cf6045" containerName="route-controller-manager" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.158962 4912 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.159147 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.159346 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1" gracePeriod=15 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.159372 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d" gracePeriod=15 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.159486 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19" gracePeriod=15 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.159508 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c" gracePeriod=15 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.159527 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4" gracePeriod=15 Mar 18 13:06:09 crc 
kubenswrapper[4912]: I0318 13:06:09.160165 4912 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160294 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160306 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160315 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160321 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160331 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160337 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160343 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160349 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160357 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 
13:06:09.160363 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160370 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160376 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160383 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160389 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160400 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160407 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160498 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160505 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160511 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160517 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160524 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160534 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160542 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160551 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160674 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160681 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.160691 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160697 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.160799 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.254495 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.255088 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.255145 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.255171 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.255251 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.255298 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.255325 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.255366 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.261845 4912 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357029 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357156 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357193 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357216 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357198 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357278 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357234 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357318 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357336 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357363 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357376 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357401 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357415 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.357716 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.358255 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.358426 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.560478 4912 patch_prober.go:28] interesting pod/controller-manager-6f4d668c6b-bx78m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.560557 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6f4d668c6b-bx78m" podUID="6dd3b680-f303-4dcf-a99d-29f16d46cfdf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.561163 4912 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event=< Mar 18 13:06:09 crc kubenswrapper[4912]: &Event{ObjectMeta:{controller-manager-6f4d668c6b-bx78m.189df1539654713f openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6f4d668c6b-bx78m,UID:6dd3b680-f303-4dcf-a99d-29f16d46cfdf,APIVersion:v1,ResourceVersion:29682,FieldPath:spec.containers{controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.63:8443/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 13:06:09 crc kubenswrapper[4912]: body: Mar 18 13:06:09 crc kubenswrapper[4912]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:06:09.560539455 +0000 UTC m=+218.019966880,LastTimestamp:2026-03-18 13:06:09.560539455 +0000 UTC m=+218.019966880,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:06:09 crc kubenswrapper[4912]: > Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.562448 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.578349 4912 patch_prober.go:28] interesting pod/route-controller-manager-6d96bbbfc9-dgtz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.578446 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d96bbbfc9-dgtz2" podUID="25454711-58f4-4b39-96fb-2783d2cf6045" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:06:09 crc kubenswrapper[4912]: W0318 13:06:09.581542 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f993bc3bd0fdb73483067337f5476f3a30137ac54393b1f6964ebe3c397885fd WatchSource:0}: Error finding container f993bc3bd0fdb73483067337f5476f3a30137ac54393b1f6964ebe3c397885fd: Status 404 returned error can't find the container with id 
f993bc3bd0fdb73483067337f5476f3a30137ac54393b1f6964ebe3c397885fd Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.923102 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.924756 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.925950 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c" exitCode=0 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.925996 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d" exitCode=0 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.926012 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19" exitCode=0 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.926030 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4" exitCode=2 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.926111 4912 scope.go:117] "RemoveContainer" containerID="0997cec90d5793f5b28cfb4514dbd67bfc05f143ddd38cc76b342a07ac66a85b" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.928294 4912 generic.go:334] "Generic (PLEG): container finished" podID="cfc460f0-40a2-4a51-a361-d29032b64e15" containerID="5a8694b9c0aabfb5c5182aad4f39857090dddcd964722fff5ad2c6c0265287b6" exitCode=0 Mar 18 
13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.928414 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cfc460f0-40a2-4a51-a361-d29032b64e15","Type":"ContainerDied","Data":"5a8694b9c0aabfb5c5182aad4f39857090dddcd964722fff5ad2c6c0265287b6"} Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.929460 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.930120 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.930972 4912 generic.go:334] "Generic (PLEG): container finished" podID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" containerID="56df29823074927f3beabd9ee4c3df2aa40497f476a8b01aa79403e9aacb998e" exitCode=0 Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.931023 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" event={"ID":"b2cbf235-6dbe-4747-b167-89f2593c2ee9","Type":"ContainerDied","Data":"56df29823074927f3beabd9ee4c3df2aa40497f476a8b01aa79403e9aacb998e"} Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.931743 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": 
dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.932138 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.932443 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.934973 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09"} Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.935016 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f993bc3bd0fdb73483067337f5476f3a30137ac54393b1f6964ebe3c397885fd"} Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.935801 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:09 crc kubenswrapper[4912]: E0318 13:06:09.935895 4912 kubelet.go:1929] "Failed creating a 
mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.936415 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:09 crc kubenswrapper[4912]: I0318 13:06:09.936843 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:10 crc kubenswrapper[4912]: I0318 13:06:10.121771 4912 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-19 04:27:51.576731985 +0000 UTC Mar 18 13:06:10 crc kubenswrapper[4912]: I0318 13:06:10.121856 4912 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7359h21m41.454879519s for next certificate rotation Mar 18 13:06:10 crc kubenswrapper[4912]: I0318 13:06:10.235919 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25454711-58f4-4b39-96fb-2783d2cf6045" path="/var/lib/kubelet/pods/25454711-58f4-4b39-96fb-2783d2cf6045/volumes" Mar 18 13:06:10 crc kubenswrapper[4912]: I0318 13:06:10.237332 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd3b680-f303-4dcf-a99d-29f16d46cfdf" path="/var/lib/kubelet/pods/6dd3b680-f303-4dcf-a99d-29f16d46cfdf/volumes" Mar 18 13:06:10 crc kubenswrapper[4912]: I0318 
13:06:10.237963 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840ef508-c05b-4b3b-bb16-e15729003be1" path="/var/lib/kubelet/pods/840ef508-c05b-4b3b-bb16-e15729003be1/volumes" Mar 18 13:06:10 crc kubenswrapper[4912]: I0318 13:06:10.944578 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.122394 4912 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 06:50:54.596906549 +0000 UTC Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.122446 4912 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6233h44m43.474464216s for next certificate rotation Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.323029 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.324112 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.324865 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.357713 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.358231 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.358433 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.384530 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-var-lock\") pod \"cfc460f0-40a2-4a51-a361-d29032b64e15\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.384607 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfc460f0-40a2-4a51-a361-d29032b64e15-kube-api-access\") pod \"cfc460f0-40a2-4a51-a361-d29032b64e15\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.384646 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-kubelet-dir\") pod \"cfc460f0-40a2-4a51-a361-d29032b64e15\" (UID: \"cfc460f0-40a2-4a51-a361-d29032b64e15\") " Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.385029 4912 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cfc460f0-40a2-4a51-a361-d29032b64e15" (UID: "cfc460f0-40a2-4a51-a361-d29032b64e15"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.385237 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-var-lock" (OuterVolumeSpecName: "var-lock") pod "cfc460f0-40a2-4a51-a361-d29032b64e15" (UID: "cfc460f0-40a2-4a51-a361-d29032b64e15"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.407905 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc460f0-40a2-4a51-a361-d29032b64e15-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cfc460f0-40a2-4a51-a361-d29032b64e15" (UID: "cfc460f0-40a2-4a51-a361-d29032b64e15"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.486975 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ktft\" (UniqueName: \"kubernetes.io/projected/b2cbf235-6dbe-4747-b167-89f2593c2ee9-kube-api-access-9ktft\") pod \"b2cbf235-6dbe-4747-b167-89f2593c2ee9\" (UID: \"b2cbf235-6dbe-4747-b167-89f2593c2ee9\") " Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.488795 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfc460f0-40a2-4a51-a361-d29032b64e15-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.488993 4912 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.489250 4912 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfc460f0-40a2-4a51-a361-d29032b64e15-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.492173 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2cbf235-6dbe-4747-b167-89f2593c2ee9-kube-api-access-9ktft" (OuterVolumeSpecName: "kube-api-access-9ktft") pod "b2cbf235-6dbe-4747-b167-89f2593c2ee9" (UID: "b2cbf235-6dbe-4747-b167-89f2593c2ee9"). InnerVolumeSpecName "kube-api-access-9ktft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.554962 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.556022 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.556704 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.557376 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.557975 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.591373 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ktft\" (UniqueName: \"kubernetes.io/projected/b2cbf235-6dbe-4747-b167-89f2593c2ee9-kube-api-access-9ktft\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 
13:06:11.693243 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693353 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693441 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693534 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693566 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693671 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693882 4912 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693907 4912 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.693925 4912 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.954634 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.954646 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cfc460f0-40a2-4a51-a361-d29032b64e15","Type":"ContainerDied","Data":"42d6ebcd7c46eccfb8d721c761978c9d7648e8116e8ea75b682b159666894f43"} Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.954892 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42d6ebcd7c46eccfb8d721c761978c9d7648e8116e8ea75b682b159666894f43" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.956757 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.957082 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" event={"ID":"b2cbf235-6dbe-4747-b167-89f2593c2ee9","Type":"ContainerDied","Data":"05224b97d5962b68efbbc162c696579edf77719ab3621337d0833a5c73bcc2dd"} Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.957118 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05224b97d5962b68efbbc162c696579edf77719ab3621337d0833a5c73bcc2dd" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.959401 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.959965 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1" exitCode=0 Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.960006 4912 scope.go:117] "RemoveContainer" 
containerID="cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.960122 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.970653 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.971000 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.971806 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.976065 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.976500 4912 status_manager.go:851] "Failed to get status for pod" 
podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.976695 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: E0318 13:06:11.977117 4912 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: E0318 13:06:11.977518 4912 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.977554 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.977884 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: 
connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: E0318 13:06:11.978082 4912 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.978254 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: E0318 13:06:11.978321 4912 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: E0318 13:06:11.978555 4912 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.978590 4912 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 13:06:11 crc kubenswrapper[4912]: E0318 13:06:11.978882 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms" Mar 18 13:06:11 crc kubenswrapper[4912]: I0318 13:06:11.983875 4912 scope.go:117] "RemoveContainer" 
containerID="17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.001621 4912 scope.go:117] "RemoveContainer" containerID="793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.016183 4912 scope.go:117] "RemoveContainer" containerID="e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.031086 4912 scope.go:117] "RemoveContainer" containerID="2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.046867 4912 scope.go:117] "RemoveContainer" containerID="215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.070382 4912 scope.go:117] "RemoveContainer" containerID="cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c" Mar 18 13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.072606 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c\": container with ID starting with cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c not found: ID does not exist" containerID="cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.072675 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c"} err="failed to get container status \"cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c\": rpc error: code = NotFound desc = could not find container \"cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c\": container with ID starting with 
cf335cb0447d77275568ce718f3610a5d744595f1b74461d03f0ad9a08fe923c not found: ID does not exist" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.072719 4912 scope.go:117] "RemoveContainer" containerID="17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d" Mar 18 13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.073552 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d\": container with ID starting with 17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d not found: ID does not exist" containerID="17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.073646 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d"} err="failed to get container status \"17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d\": rpc error: code = NotFound desc = could not find container \"17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d\": container with ID starting with 17e90a25f76a0efcfa5a0c9ba6fa3c12784e04094983743d4145ff32b22d585d not found: ID does not exist" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.073703 4912 scope.go:117] "RemoveContainer" containerID="793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19" Mar 18 13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.074494 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19\": container with ID starting with 793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19 not found: ID does not exist" containerID="793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19" Mar 18 13:06:12 crc 
kubenswrapper[4912]: I0318 13:06:12.074541 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19"} err="failed to get container status \"793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19\": rpc error: code = NotFound desc = could not find container \"793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19\": container with ID starting with 793f36a995569fa5e5c1fe63b04fed827674a181a2552a3043f0d831be239b19 not found: ID does not exist" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.074574 4912 scope.go:117] "RemoveContainer" containerID="e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4" Mar 18 13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.074908 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4\": container with ID starting with e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4 not found: ID does not exist" containerID="e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.074976 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4"} err="failed to get container status \"e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4\": rpc error: code = NotFound desc = could not find container \"e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4\": container with ID starting with e4255d9a108087630240d24aa72d1d6987804cad07ad9989c6e1014ce046bea4 not found: ID does not exist" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.075010 4912 scope.go:117] "RemoveContainer" containerID="2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1" Mar 18 
13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.075430 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1\": container with ID starting with 2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1 not found: ID does not exist" containerID="2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.075464 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1"} err="failed to get container status \"2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1\": rpc error: code = NotFound desc = could not find container \"2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1\": container with ID starting with 2eaa64603c4dad57966b6e4e269db0d38e17962b11762cbbc3834fb77e0192d1 not found: ID does not exist" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.075489 4912 scope.go:117] "RemoveContainer" containerID="215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad" Mar 18 13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.075803 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad\": container with ID starting with 215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad not found: ID does not exist" containerID="215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.075832 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad"} err="failed to get container status 
\"215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad\": rpc error: code = NotFound desc = could not find container \"215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad\": container with ID starting with 215d4ca5a59f6bb0424ea5d34b6e353e027a15a1014c822bec844b28fe29f1ad not found: ID does not exist" Mar 18 13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.179600 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.261691 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.262173 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.262730 4912 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:12 crc kubenswrapper[4912]: I0318 13:06:12.265837 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" 
path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 13:06:12 crc kubenswrapper[4912]: E0318 13:06:12.580280 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Mar 18 13:06:13 crc kubenswrapper[4912]: E0318 13:06:13.381456 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Mar 18 13:06:14 crc kubenswrapper[4912]: E0318 13:06:14.982071 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="3.2s" Mar 18 13:06:17 crc kubenswrapper[4912]: E0318 13:06:17.836103 4912 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event=< Mar 18 13:06:17 crc kubenswrapper[4912]: &Event{ObjectMeta:{controller-manager-6f4d668c6b-bx78m.189df1539654713f openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6f4d668c6b-bx78m,UID:6dd3b680-f303-4dcf-a99d-29f16d46cfdf,APIVersion:v1,ResourceVersion:29682,FieldPath:spec.containers{controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.63:8443/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) 
Mar 18 13:06:17 crc kubenswrapper[4912]: body: Mar 18 13:06:17 crc kubenswrapper[4912]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:06:09.560539455 +0000 UTC m=+218.019966880,LastTimestamp:2026-03-18 13:06:09.560539455 +0000 UTC m=+218.019966880,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 13:06:17 crc kubenswrapper[4912]: > Mar 18 13:06:18 crc kubenswrapper[4912]: E0318 13:06:18.183493 4912 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="6.4s" Mar 18 13:06:21 crc kubenswrapper[4912]: I0318 13:06:21.227374 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:21 crc kubenswrapper[4912]: I0318 13:06:21.228292 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:21 crc kubenswrapper[4912]: I0318 13:06:21.229491 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:21 crc kubenswrapper[4912]: I0318 13:06:21.249692 4912 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:21 crc 
kubenswrapper[4912]: I0318 13:06:21.249736 4912 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:21 crc kubenswrapper[4912]: E0318 13:06:21.250307 4912 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:21 crc kubenswrapper[4912]: I0318 13:06:21.250963 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.022351 4912 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="026ca7aea41f38064934d3c573db2e8cf137bafbfb7c2c445578a7b76d361d18" exitCode=0 Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.022700 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"026ca7aea41f38064934d3c573db2e8cf137bafbfb7c2c445578a7b76d361d18"} Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.022795 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3bfb9e08f79c1daf209e8a8001411ae01fede8483127e36e5c239db79782aee0"} Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.023237 4912 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.023257 4912 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.023499 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: E0318 13:06:22.023558 4912 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.023676 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.026827 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.027476 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.027520 4912 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="033c8d97bacf98c7ec2e36fad49fb41b161b92e3d0ea907012c00b4248974787" exitCode=1 Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.027547 
4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"033c8d97bacf98c7ec2e36fad49fb41b161b92e3d0ea907012c00b4248974787"} Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.027906 4912 scope.go:117] "RemoveContainer" containerID="033c8d97bacf98c7ec2e36fad49fb41b161b92e3d0ea907012c00b4248974787" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.028763 4912 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.029281 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.029587 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.235211 4912 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.235762 4912 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.236130 4912 status_manager.go:851] "Failed to get status for pod" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" pod="openshift-infra/auto-csr-approver-29563986-rl8cg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563986-rl8cg\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.236583 4912 status_manager.go:851] "Failed to get status for pod" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 18 13:06:22 crc kubenswrapper[4912]: I0318 13:06:22.423317 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:06:23 crc kubenswrapper[4912]: I0318 13:06:23.039031 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e10906d92e4487626df13871969dfde6130a543dd0eda92135185e8816bab8db"} Mar 18 13:06:23 crc kubenswrapper[4912]: I0318 13:06:23.039428 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3f7d21774400bb69e9e9cd84abd191afbabda9ac8380b0597f7297ea1886bb8d"} Mar 18 13:06:23 crc kubenswrapper[4912]: I0318 13:06:23.039445 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a9806a188fb1b86ed4884c2f90679040129380320eef2de6db3ba81dad25e3a7"} Mar 18 13:06:23 crc kubenswrapper[4912]: I0318 13:06:23.039457 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"82526d24c2e0d4e4c5e8d0397553a6d4938657ffb9fc25589de64b7336a9d3d6"} Mar 18 13:06:23 crc kubenswrapper[4912]: I0318 13:06:23.044502 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 13:06:23 crc kubenswrapper[4912]: I0318 13:06:23.045235 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 13:06:23 crc kubenswrapper[4912]: I0318 13:06:23.045317 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"47de1d4d2a4cc0449a49a7726dfac56128bf96a71c4b88cf7a3a311bfe14b370"} Mar 18 13:06:24 crc kubenswrapper[4912]: I0318 13:06:24.054065 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"23bdac0c1cfbcdeb1fba55a43fb044cad2a9c2ac0ecdead53f4d743a25006e41"} Mar 18 13:06:24 crc kubenswrapper[4912]: I0318 
13:06:24.054835 4912 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:24 crc kubenswrapper[4912]: I0318 13:06:24.054860 4912 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:25 crc kubenswrapper[4912]: I0318 13:06:25.820411 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:06:26 crc kubenswrapper[4912]: I0318 13:06:26.251929 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:26 crc kubenswrapper[4912]: I0318 13:06:26.251976 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:26 crc kubenswrapper[4912]: I0318 13:06:26.260400 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:29 crc kubenswrapper[4912]: I0318 13:06:29.069096 4912 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:30 crc kubenswrapper[4912]: I0318 13:06:30.086800 4912 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:30 crc kubenswrapper[4912]: I0318 13:06:30.086834 4912 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:30 crc kubenswrapper[4912]: I0318 13:06:30.087014 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:30 crc kubenswrapper[4912]: I0318 13:06:30.091302 4912 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:30 crc kubenswrapper[4912]: I0318 13:06:30.094581 4912 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c167629a-fbd8-46b6-8075-faa8d0be6498" Mar 18 13:06:31 crc kubenswrapper[4912]: I0318 13:06:31.092392 4912 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:31 crc kubenswrapper[4912]: I0318 13:06:31.092431 4912 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:32 crc kubenswrapper[4912]: I0318 13:06:32.246677 4912 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c167629a-fbd8-46b6-8075-faa8d0be6498" Mar 18 13:06:32 crc kubenswrapper[4912]: I0318 13:06:32.424209 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:06:32 crc kubenswrapper[4912]: I0318 13:06:32.428342 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:06:33 crc kubenswrapper[4912]: I0318 13:06:33.111228 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:06:36 crc kubenswrapper[4912]: I0318 13:06:36.998719 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:06:36 crc kubenswrapper[4912]: I0318 13:06:36.999284 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:06:38 crc kubenswrapper[4912]: I0318 13:06:38.466237 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 13:06:38 crc kubenswrapper[4912]: I0318 13:06:38.619942 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 13:06:38 crc kubenswrapper[4912]: I0318 13:06:38.927100 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 13:06:39 crc kubenswrapper[4912]: I0318 13:06:39.130810 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 13:06:39 crc kubenswrapper[4912]: I0318 13:06:39.651016 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.093774 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.322016 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.407076 4912 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" 
Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.422483 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.478428 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.536075 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.605276 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.827190 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 13:06:40 crc kubenswrapper[4912]: I0318 13:06:40.838801 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.116504 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.237163 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.253252 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.370713 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.378030 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.477336 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.550773 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.585586 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.720582 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.882545 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.901677 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 13:06:41 crc kubenswrapper[4912]: I0318 13:06:41.931419 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.019058 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.085409 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.087703 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 13:06:42 crc 
kubenswrapper[4912]: I0318 13:06:42.159676 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.194763 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.227538 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.296113 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.431274 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.453277 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.540329 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.589754 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.598787 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.631069 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 13:06:42.674591 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 13:06:42 crc kubenswrapper[4912]: I0318 
13:06:42.793402 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.106993 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.152000 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.158602 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.216023 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.335102 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.363842 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.412780 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.435487 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.441000 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.510560 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.520261 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.562155 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.653760 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 13:06:43 crc kubenswrapper[4912]: I0318 13:06:43.961340 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.016077 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.058028 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.297918 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.313271 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.338255 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.390787 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 13:06:44 crc 
kubenswrapper[4912]: I0318 13:06:44.586472 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.709022 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.754125 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.758718 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.812228 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.827778 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.846969 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.849841 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.939574 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.954148 4912 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960435 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 
13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960493 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b","openshift-controller-manager/controller-manager-758896dd6-55gnf","openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 13:06:44 crc kubenswrapper[4912]: E0318 13:06:44.960695 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" containerName="installer" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960711 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" containerName="installer" Mar 18 13:06:44 crc kubenswrapper[4912]: E0318 13:06:44.960723 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" containerName="oc" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960730 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" containerName="oc" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960817 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" containerName="oc" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960828 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc460f0-40a2-4a51-a361-d29032b64e15" containerName="installer" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960869 4912 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.960889 4912 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="68ba56fd-d94a-442a-a90a-0b540230d3ca" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.961317 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.961854 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.965073 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.965375 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.966896 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.966933 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.967004 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.966934 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.967440 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.967502 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.967650 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 
13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.967740 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.967986 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.968082 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.977672 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.978899 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 13:06:44 crc kubenswrapper[4912]: I0318 13:06:44.990574 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.015432 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.015413177 podStartE2EDuration="16.015413177s" podCreationTimestamp="2026-03-18 13:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:06:45.009055256 +0000 UTC m=+253.468482701" watchObservedRunningTime="2026-03-18 13:06:45.015413177 +0000 UTC m=+253.474840612" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.049261 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.091999 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6df54ff-ee21-4b6b-bab8-86839f9a035c-serving-cert\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092192 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6df54ff-ee21-4b6b-bab8-86839f9a035c-client-ca\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092235 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a137f3f5-d649-4e59-80a5-0aedb734a766-serving-cert\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092265 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85hh\" (UniqueName: \"kubernetes.io/projected/e6df54ff-ee21-4b6b-bab8-86839f9a035c-kube-api-access-s85hh\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092314 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6df54ff-ee21-4b6b-bab8-86839f9a035c-config\") pod 
\"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092353 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-proxy-ca-bundles\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092380 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-config\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092424 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl776\" (UniqueName: \"kubernetes.io/projected/a137f3f5-d649-4e59-80a5-0aedb734a766-kube-api-access-rl776\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.092448 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-client-ca\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.103950 4912 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.183776 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193630 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6df54ff-ee21-4b6b-bab8-86839f9a035c-serving-cert\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193691 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6df54ff-ee21-4b6b-bab8-86839f9a035c-client-ca\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193733 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a137f3f5-d649-4e59-80a5-0aedb734a766-serving-cert\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193760 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85hh\" (UniqueName: \"kubernetes.io/projected/e6df54ff-ee21-4b6b-bab8-86839f9a035c-kube-api-access-s85hh\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " 
pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193806 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6df54ff-ee21-4b6b-bab8-86839f9a035c-config\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193839 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-proxy-ca-bundles\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193863 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-config\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193904 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl776\" (UniqueName: \"kubernetes.io/projected/a137f3f5-d649-4e59-80a5-0aedb734a766-kube-api-access-rl776\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.193926 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-client-ca\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.194950 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-client-ca\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.196581 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6df54ff-ee21-4b6b-bab8-86839f9a035c-config\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.197675 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-proxy-ca-bundles\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.198760 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6df54ff-ee21-4b6b-bab8-86839f9a035c-client-ca\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.198774 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a137f3f5-d649-4e59-80a5-0aedb734a766-config\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.201804 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a137f3f5-d649-4e59-80a5-0aedb734a766-serving-cert\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.201871 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6df54ff-ee21-4b6b-bab8-86839f9a035c-serving-cert\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.215371 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl776\" (UniqueName: \"kubernetes.io/projected/a137f3f5-d649-4e59-80a5-0aedb734a766-kube-api-access-rl776\") pod \"controller-manager-758896dd6-55gnf\" (UID: \"a137f3f5-d649-4e59-80a5-0aedb734a766\") " pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.216696 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85hh\" (UniqueName: \"kubernetes.io/projected/e6df54ff-ee21-4b6b-bab8-86839f9a035c-kube-api-access-s85hh\") pod \"route-controller-manager-76df45d45-cmf7b\" (UID: \"e6df54ff-ee21-4b6b-bab8-86839f9a035c\") " pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" 
Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.225437 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.297234 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.304709 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.311606 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.499906 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.504836 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.627058 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.815435 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 13:06:45 crc kubenswrapper[4912]: I0318 13:06:45.832362 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.043776 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.070478 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.083252 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.148862 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.199195 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.213655 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.450877 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.525819 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.526888 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.587165 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.781473 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.798405 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 13:06:46 crc 
kubenswrapper[4912]: I0318 13:06:46.835764 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.904322 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.927461 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.973406 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 13:06:46 crc kubenswrapper[4912]: I0318 13:06:46.989938 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.012883 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.046212 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.135073 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.170201 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.234321 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.235388 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.315222 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.336160 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.418715 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.465518 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.478466 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.499137 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.531593 4912 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.600519 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.634999 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.677247 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.715436 4912 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.731455 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.734940 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b"] Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.739906 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-758896dd6-55gnf"] Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.780352 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.816070 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 13:06:47 crc kubenswrapper[4912]: I0318 13:06:47.962745 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.013545 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.050451 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.050815 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.073350 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 
13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.082434 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.116887 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.132330 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.190393 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.230159 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.283553 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.386785 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.422910 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.439823 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 13:06:48 crc kubenswrapper[4912]: E0318 13:06:48.466353 4912 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 13:06:48 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08" Netns:"/var/run/netns/b4e7508f-f603-4895-a756-eabce80a498b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-76df45d45-cmf7b in out of cluster comm: pod "route-controller-manager-76df45d45-cmf7b" not found Mar 18 13:06:48 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:48 crc kubenswrapper[4912]: > Mar 18 13:06:48 crc kubenswrapper[4912]: E0318 13:06:48.466439 4912 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 13:06:48 crc kubenswrapper[4912]: rpc 
error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08" Netns:"/var/run/netns/b4e7508f-f603-4895-a756-eabce80a498b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-76df45d45-cmf7b in out of cluster comm: pod "route-controller-manager-76df45d45-cmf7b" not found Mar 18 13:06:48 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:48 crc kubenswrapper[4912]: > pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:48 crc 
kubenswrapper[4912]: E0318 13:06:48.466461 4912 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 13:06:48 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08" Netns:"/var/run/netns/b4e7508f-f603-4895-a756-eabce80a498b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-76df45d45-cmf7b in out of cluster comm: pod "route-controller-manager-76df45d45-cmf7b" not found Mar 18 13:06:48 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} 
Mar 18 13:06:48 crc kubenswrapper[4912]: > pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:48 crc kubenswrapper[4912]: E0318 13:06:48.466527 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager(e6df54ff-ee21-4b6b-bab8-86839f9a035c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager(e6df54ff-ee21-4b6b-bab8-86839f9a035c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08\\\" Netns:\\\"/var/run/netns/b4e7508f-f603-4895-a756-eabce80a498b\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=6e1292ac673f511fb16de7dd17ea1f55161b9b262e499c3107621b7e4fa1cd08;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-76df45d45-cmf7b in out of cluster comm: 
pod \\\"route-controller-manager-76df45d45-cmf7b\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" Mar 18 13:06:48 crc kubenswrapper[4912]: E0318 13:06:48.501673 4912 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 13:06:48 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34" Netns:"/var/run/netns/99f60d28-5781-4abf-a40f-37f791d88abf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: [openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the 
networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod "controller-manager-758896dd6-55gnf" not found Mar 18 13:06:48 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:48 crc kubenswrapper[4912]: > Mar 18 13:06:48 crc kubenswrapper[4912]: E0318 13:06:48.501751 4912 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 13:06:48 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34" Netns:"/var/run/netns/99f60d28-5781-4abf-a40f-37f791d88abf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: [openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the networks 
status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod "controller-manager-758896dd6-55gnf" not found Mar 18 13:06:48 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:48 crc kubenswrapper[4912]: > pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:48 crc kubenswrapper[4912]: E0318 13:06:48.501772 4912 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 13:06:48 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34" Netns:"/var/run/netns/99f60d28-5781-4abf-a40f-37f791d88abf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: 
[openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod "controller-manager-758896dd6-55gnf" not found Mar 18 13:06:48 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:48 crc kubenswrapper[4912]: > pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:48 crc kubenswrapper[4912]: E0318 13:06:48.501831 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-758896dd6-55gnf_openshift-controller-manager(a137f3f5-d649-4e59-80a5-0aedb734a766)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-758896dd6-55gnf_openshift-controller-manager(a137f3f5-d649-4e59-80a5-0aedb734a766)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34\\\" Netns:\\\"/var/run/netns/99f60d28-5781-4abf-a40f-37f791d88abf\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=e8b81ce7261b8c3d47cc9da0009ead8435451f6d22262f9a3be69795be740c34;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: [openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod \\\"controller-manager-758896dd6-55gnf\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.507891 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.511480 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.532094 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.561591 4912 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.607718 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.626553 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.684715 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.704135 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.739961 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.794377 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.795973 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.807190 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 13:06:48 crc kubenswrapper[4912]: I0318 13:06:48.990253 4912 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.002573 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 13:06:49 crc kubenswrapper[4912]: 
I0318 13:06:49.084843 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.088624 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.210417 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.210684 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.210970 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.211293 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.215772 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.225993 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.270864 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.331113 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.333413 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.358576 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.451093 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.462724 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.519179 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.533883 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 13:06:49 crc kubenswrapper[4912]: 
I0318 13:06:49.548105 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.609589 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.739732 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.903505 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.944220 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 13:06:49 crc kubenswrapper[4912]: I0318 13:06:49.953123 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.025202 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.037902 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.073725 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.075718 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.077689 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.284100 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.286231 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.315401 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.341809 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.559481 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.751301 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 13:06:50 crc kubenswrapper[4912]: I0318 13:06:50.819737 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.066976 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.189654 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.223360 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.233226 4912 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.257561 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.311015 4912 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.311473 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09" gracePeriod=5 Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.317635 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.400520 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.448559 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.459088 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.507125 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.515288 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.519976 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.578882 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.708456 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.710250 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.719795 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.847727 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 13:06:51 crc kubenswrapper[4912]: I0318 13:06:51.905027 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.185123 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.244542 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.269202 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 13:06:52 crc 
kubenswrapper[4912]: E0318 13:06:52.430623 4912 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 13:06:52 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb" Netns:"/var/run/netns/084aad24-fd08-4581-93d5-8b05afbbc6ec" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-76df45d45-cmf7b in out of cluster comm: pod "route-controller-manager-76df45d45-cmf7b" not found Mar 18 13:06:52 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 
13:06:52 crc kubenswrapper[4912]: > Mar 18 13:06:52 crc kubenswrapper[4912]: E0318 13:06:52.430732 4912 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 13:06:52 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb" Netns:"/var/run/netns/084aad24-fd08-4581-93d5-8b05afbbc6ec" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-76df45d45-cmf7b in out of cluster comm: pod "route-controller-manager-76df45d45-cmf7b" not found Mar 18 13:06:52 crc kubenswrapper[4912]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:52 crc kubenswrapper[4912]: > pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:52 crc kubenswrapper[4912]: E0318 13:06:52.430757 4912 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 13:06:52 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb" Netns:"/var/run/netns/084aad24-fd08-4581-93d5-8b05afbbc6ec" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod 
route-controller-manager-76df45d45-cmf7b in out of cluster comm: pod "route-controller-manager-76df45d45-cmf7b" not found Mar 18 13:06:52 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:52 crc kubenswrapper[4912]: > pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 13:06:52 crc kubenswrapper[4912]: E0318 13:06:52.430823 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager(e6df54ff-ee21-4b6b-bab8-86839f9a035c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager(e6df54ff-ee21-4b6b-bab8-86839f9a035c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-76df45d45-cmf7b_openshift-route-controller-manager_e6df54ff-ee21-4b6b-bab8-86839f9a035c_0(ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb): error adding pod openshift-route-controller-manager_route-controller-manager-76df45d45-cmf7b to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb\\\" Netns:\\\"/var/run/netns/084aad24-fd08-4581-93d5-8b05afbbc6ec\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-76df45d45-cmf7b;K8S_POD_INFRA_CONTAINER_ID=ed5a8a97a695caf54e322dff7b3a25b8f9a3a556d7def6c38add56a0cce169bb;K8S_POD_UID=e6df54ff-ee21-4b6b-bab8-86839f9a035c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b/e6df54ff-ee21-4b6b-bab8-86839f9a035c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-76df45d45-cmf7b in out of cluster comm: pod \\\"route-controller-manager-76df45d45-cmf7b\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.455105 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 13:06:52 crc kubenswrapper[4912]: E0318 13:06:52.459071 4912 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 13:06:52 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0" Netns:"/var/run/netns/5044902c-bdc2-40a8-b208-2a50dc26a9f9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: [openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod "controller-manager-758896dd6-55gnf" not found Mar 18 13:06:52 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:52 crc kubenswrapper[4912]: > Mar 18 13:06:52 crc kubenswrapper[4912]: E0318 13:06:52.459142 4912 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 13:06:52 crc kubenswrapper[4912]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0" Netns:"/var/run/netns/5044902c-bdc2-40a8-b208-2a50dc26a9f9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: [openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod "controller-manager-758896dd6-55gnf" not found Mar 18 13:06:52 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:52 crc kubenswrapper[4912]: > pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:52 crc kubenswrapper[4912]: E0318 13:06:52.459162 4912 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 13:06:52 crc kubenswrapper[4912]: rpc error: 
code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0" Netns:"/var/run/netns/5044902c-bdc2-40a8-b208-2a50dc26a9f9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: [openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod "controller-manager-758896dd6-55gnf" not found Mar 18 13:06:52 crc kubenswrapper[4912]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 13:06:52 crc kubenswrapper[4912]: > pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 13:06:52 crc kubenswrapper[4912]: E0318 13:06:52.459219 4912 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"controller-manager-758896dd6-55gnf_openshift-controller-manager(a137f3f5-d649-4e59-80a5-0aedb734a766)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-758896dd6-55gnf_openshift-controller-manager(a137f3f5-d649-4e59-80a5-0aedb734a766)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-758896dd6-55gnf_openshift-controller-manager_a137f3f5-d649-4e59-80a5-0aedb734a766_0(50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0): error adding pod openshift-controller-manager_controller-manager-758896dd6-55gnf to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0\\\" Netns:\\\"/var/run/netns/5044902c-bdc2-40a8-b208-2a50dc26a9f9\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-758896dd6-55gnf;K8S_POD_INFRA_CONTAINER_ID=50b7c8e2b40adbcfe360ff54a9d8693d51a81a5f04cd0fe9c90cae7bd12657b0;K8S_POD_UID=a137f3f5-d649-4e59-80a5-0aedb734a766\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-758896dd6-55gnf] networking: Multus: [openshift-controller-manager/controller-manager-758896dd6-55gnf/a137f3f5-d649-4e59-80a5-0aedb734a766]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-758896dd6-55gnf in out of cluster comm: pod \\\"controller-manager-758896dd6-55gnf\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.490307 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.498582 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.511160 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.521488 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.528311 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.579639 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.791672 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 18 13:06:52 crc kubenswrapper[4912]: I0318 13:06:52.961939 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 13:06:53 crc kubenswrapper[4912]: I0318 13:06:53.077699 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 18 13:06:53 crc kubenswrapper[4912]: I0318 13:06:53.258603 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 18 13:06:53 crc kubenswrapper[4912]: I0318 13:06:53.511843 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 13:06:53 crc kubenswrapper[4912]: I0318 13:06:53.559768 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 13:06:53 crc kubenswrapper[4912]: I0318 13:06:53.810496 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 13:06:53 crc kubenswrapper[4912]: I0318 13:06:53.989471 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.032301 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.039136 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.075489 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.131381 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.152249 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.235462 4912 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.355486 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.485454 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.634441 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.637873 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.801884 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.863535 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.963852 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 18 13:06:54 crc kubenswrapper[4912]: I0318 13:06:54.965542 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 13:06:55 crc kubenswrapper[4912]: I0318 13:06:55.117028 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 18 13:06:55 crc kubenswrapper[4912]: I0318 13:06:55.313138 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 18 13:06:55 crc kubenswrapper[4912]: I0318 13:06:55.406738 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 18 13:06:55 crc kubenswrapper[4912]: I0318 13:06:55.509297 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 13:06:55 crc kubenswrapper[4912]: I0318 13:06:55.662908 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 13:06:55 crc kubenswrapper[4912]: I0318 13:06:55.734893 4912 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 18 13:06:55 crc kubenswrapper[4912]: I0318 13:06:55.809160 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.129502 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.451221 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.451326 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579246 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579368 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579422 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579446 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579543 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579563 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579616 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579717 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.579908 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.580563 4912 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.580604 4912 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.580620 4912 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.580634 4912 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.591621 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.682159 4912 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:56 crc kubenswrapper[4912]: I0318 13:06:56.983311 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.070615 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.258767 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.258824 4912 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09" exitCode=137
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.258871 4912 scope.go:117] "RemoveContainer" containerID="f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.258995 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.293106 4912 scope.go:117] "RemoveContainer" containerID="f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09"
Mar 18 13:06:57 crc kubenswrapper[4912]: E0318 13:06:57.294180 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09\": container with ID starting with f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09 not found: ID does not exist" containerID="f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.294237 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09"} err="failed to get container status \"f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09\": rpc error: code = NotFound desc = could not find container \"f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09\": container with ID starting with f532bec9977ad64e424c4de0372ef1f13eedf884d3015dca51a7344ede88ce09 not found: ID does not exist"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.653270 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 18 13:06:57 crc kubenswrapper[4912]: I0318 13:06:57.673539 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 18 13:06:58 crc kubenswrapper[4912]: I0318 13:06:58.243154 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 18 13:07:05 crc kubenswrapper[4912]: I0318 13:07:05.227247 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b"
Mar 18 13:07:05 crc kubenswrapper[4912]: I0318 13:07:05.227312 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf"
Mar 18 13:07:05 crc kubenswrapper[4912]: I0318 13:07:05.228241 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf"
Mar 18 13:07:05 crc kubenswrapper[4912]: I0318 13:07:05.228256 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b"
Mar 18 13:07:05 crc kubenswrapper[4912]: I0318 13:07:05.494669 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b"]
Mar 18 13:07:05 crc kubenswrapper[4912]: I0318 13:07:05.661641 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-758896dd6-55gnf"]
Mar 18 13:07:05 crc kubenswrapper[4912]: W0318 13:07:05.665167 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda137f3f5_d649_4e59_80a5_0aedb734a766.slice/crio-150fa25a0a5421b922a9dc3575b4895581eb62fb42c65f00bf07ed6ef21d54b3 WatchSource:0}: Error finding container 150fa25a0a5421b922a9dc3575b4895581eb62fb42c65f00bf07ed6ef21d54b3: Status 404 returned error can't find the container with id 150fa25a0a5421b922a9dc3575b4895581eb62fb42c65f00bf07ed6ef21d54b3
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.319515 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" event={"ID":"e6df54ff-ee21-4b6b-bab8-86839f9a035c","Type":"ContainerStarted","Data":"4317cf97fdfbf35b04a793d8a339ef489d493e75a32877049563389b71ba16e5"}
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.319610 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" event={"ID":"e6df54ff-ee21-4b6b-bab8-86839f9a035c","Type":"ContainerStarted","Data":"d96c27ec84268b03c1d2de585078b117b6b81b5923bca55eb2817bea10ece286"}
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.320561 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b"
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.320953 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" event={"ID":"a137f3f5-d649-4e59-80a5-0aedb734a766","Type":"ContainerStarted","Data":"7dae4616235a902ccd3a026d9bf2db75cb2b0d8e0a9ceba0743173e72fddbb0e"}
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.320973 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" event={"ID":"a137f3f5-d649-4e59-80a5-0aedb734a766","Type":"ContainerStarted","Data":"150fa25a0a5421b922a9dc3575b4895581eb62fb42c65f00bf07ed6ef21d54b3"}
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.321340 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf"
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.325115 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf"
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.332112 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b"
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.347180 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podStartSLOduration=60.347157393 podStartE2EDuration="1m0.347157393s" podCreationTimestamp="2026-03-18 13:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:07:06.343699532 +0000 UTC m=+274.803126977" watchObservedRunningTime="2026-03-18 13:07:06.347157393 +0000 UTC m=+274.806584838"
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.396259 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podStartSLOduration=60.396239353 podStartE2EDuration="1m0.396239353s" podCreationTimestamp="2026-03-18 13:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:07:06.391623869 +0000 UTC m=+274.851051304" watchObservedRunningTime="2026-03-18 13:07:06.396239353 +0000 UTC m=+274.855666778"
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.998676 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:07:06 crc kubenswrapper[4912]: I0318 13:07:06.999279 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:36.999526 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.001952 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.002166 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.003195 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acda12973c95ab76d196e7a3ee0eb3d698f14dd5571fede1cd9a2b7308ff3614"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.003431 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://acda12973c95ab76d196e7a3ee0eb3d698f14dd5571fede1cd9a2b7308ff3614" gracePeriod=600
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.027460 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-d8c92"]
Mar 18 13:07:37 crc kubenswrapper[4912]: E0318 13:07:37.027794 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.027812 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.027955 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.028615 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.053748 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-d8c92"]
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.172681 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-registry-tls\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.172759 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-bound-sa-token\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.172848 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.172873 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-registry-certificates\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.172908 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-trusted-ca\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.172947 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.172997 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.173071 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44hqz\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-kube-api-access-44hqz\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.200730 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.274951 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-registry-tls\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.275009 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-bound-sa-token\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.275072 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.275099 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-registry-certificates\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.275129 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-trusted-ca\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.275159 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.275190 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44hqz\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-kube-api-access-44hqz\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.275784 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.276807 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-registry-certificates\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.277054 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-trusted-ca\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.286778 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.286992 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-registry-tls\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.292400 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-bound-sa-token\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.292827 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44hqz\" (UniqueName: \"kubernetes.io/projected/065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b-kube-api-access-44hqz\") pod \"image-registry-66df7c8f76-d8c92\" (UID: \"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.350900 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.541144 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="acda12973c95ab76d196e7a3ee0eb3d698f14dd5571fede1cd9a2b7308ff3614" exitCode=0
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.541590 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"acda12973c95ab76d196e7a3ee0eb3d698f14dd5571fede1cd9a2b7308ff3614"}
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.541631 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"2eae871d72b2861d79999e929dd74343dc3d0189bcb36cebcb51e26fbc951242"}
Mar 18 13:07:37 crc kubenswrapper[4912]: I0318 13:07:37.776511 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-d8c92"]
Mar 18 13:07:37 crc kubenswrapper[4912]: W0318 13:07:37.783487 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065eeb41_b1d9_4ea8_9f4f_675a5bce6c3b.slice/crio-8df5203361206da87d74b197bfef91b32b9922f35a15749281c886b0fc83eadd WatchSource:0}: Error finding container 8df5203361206da87d74b197bfef91b32b9922f35a15749281c886b0fc83eadd: Status 404 returned error can't find the container with id 8df5203361206da87d74b197bfef91b32b9922f35a15749281c886b0fc83eadd
Mar 18 13:07:38 crc kubenswrapper[4912]: I0318 13:07:38.549266 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92" event={"ID":"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b","Type":"ContainerStarted","Data":"db7d27dc0da94b29ceb120a63bec104f5450be3c2110d479cff1dc96c0556de1"}
Mar 18 13:07:38 crc kubenswrapper[4912]: I0318 13:07:38.549723 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92" event={"ID":"065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b","Type":"ContainerStarted","Data":"8df5203361206da87d74b197bfef91b32b9922f35a15749281c886b0fc83eadd"}
Mar 18 13:07:38 crc kubenswrapper[4912]: I0318 13:07:38.549748 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92"
Mar 18 13:07:38 crc kubenswrapper[4912]: I0318 13:07:38.571252 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92" podStartSLOduration=1.571221271 podStartE2EDuration="1.571221271s" podCreationTimestamp="2026-03-18 13:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:07:38.568889733 +0000 UTC m=+307.028317178" watchObservedRunningTime="2026-03-18 13:07:38.571221271 +0000 UTC m=+307.030648696"
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.642442 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6ggq"]
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.643773 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k6ggq" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="registry-server" containerID="cri-o://52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" gracePeriod=30
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.661931 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9dkt"]
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.662268 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r9dkt" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="registry-server" containerID="cri-o://0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550" gracePeriod=30
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.677229 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jsbwx"]
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.677495 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerName="marketplace-operator" containerID="cri-o://01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b" gracePeriod=30
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.685183 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65p9d"]
Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.685468
4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-65p9d" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="registry-server" containerID="cri-o://71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5" gracePeriod=30 Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.704861 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nw2vt"] Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.705800 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.712635 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6x"] Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.713218 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n2x6x" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="registry-server" containerID="cri-o://b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82" gracePeriod=30 Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.720098 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nw2vt"] Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.893254 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70dff85c-f45b-431d-83ad-3b7802b15cd3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.893827 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tjt2\" (UniqueName: \"kubernetes.io/projected/70dff85c-f45b-431d-83ad-3b7802b15cd3-kube-api-access-8tjt2\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.893882 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/70dff85c-f45b-431d-83ad-3b7802b15cd3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.994911 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tjt2\" (UniqueName: \"kubernetes.io/projected/70dff85c-f45b-431d-83ad-3b7802b15cd3-kube-api-access-8tjt2\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.994985 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/70dff85c-f45b-431d-83ad-3b7802b15cd3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.995026 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70dff85c-f45b-431d-83ad-3b7802b15cd3-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:55 crc kubenswrapper[4912]: I0318 13:07:55.997469 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70dff85c-f45b-431d-83ad-3b7802b15cd3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.004397 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/70dff85c-f45b-431d-83ad-3b7802b15cd3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.012372 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tjt2\" (UniqueName: \"kubernetes.io/projected/70dff85c-f45b-431d-83ad-3b7802b15cd3-kube-api-access-8tjt2\") pod \"marketplace-operator-79b997595-nw2vt\" (UID: \"70dff85c-f45b-431d-83ad-3b7802b15cd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.027291 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.245748 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nw2vt"] Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.472458 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056 is running failed: container process not found" containerID="52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.474137 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056 is running failed: container process not found" containerID="52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.474861 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056 is running failed: container process not found" containerID="52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.474941 4912 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056 is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/certified-operators-k6ggq" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="registry-server" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.505490 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.555810 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.575697 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.607120 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-operator-metrics\") pod \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.607228 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-trusted-ca\") pod \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.607300 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqrgj\" (UniqueName: \"kubernetes.io/projected/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-kube-api-access-xqrgj\") pod \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\" (UID: \"61f97d4c-a7a2-4d3c-bb11-a397c93efbad\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.611067 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "61f97d4c-a7a2-4d3c-bb11-a397c93efbad" (UID: "61f97d4c-a7a2-4d3c-bb11-a397c93efbad"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.613564 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-kube-api-access-xqrgj" (OuterVolumeSpecName: "kube-api-access-xqrgj") pod "61f97d4c-a7a2-4d3c-bb11-a397c93efbad" (UID: "61f97d4c-a7a2-4d3c-bb11-a397c93efbad"). InnerVolumeSpecName "kube-api-access-xqrgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.615974 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "61f97d4c-a7a2-4d3c-bb11-a397c93efbad" (UID: "61f97d4c-a7a2-4d3c-bb11-a397c93efbad"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.627002 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.674471 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550 is running failed: container process not found" containerID="0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.674877 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550 is running failed: container process not found" containerID="0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.675312 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550 is running failed: container process not found" containerID="0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.675363 4912 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-r9dkt" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="registry-server" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.685548 4912 generic.go:334] "Generic (PLEG): 
container finished" podID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerID="71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5" exitCode=0 Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.685634 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65p9d" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.685622 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65p9d" event={"ID":"5201881b-c2ba-46b7-aeae-62df63a255e8","Type":"ContainerDied","Data":"71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.685738 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65p9d" event={"ID":"5201881b-c2ba-46b7-aeae-62df63a255e8","Type":"ContainerDied","Data":"315096c53f073fd8224bcb2f7aec68350379fe0f5a4deb1179da6b4cae22d818"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.685765 4912 scope.go:117] "RemoveContainer" containerID="71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.691881 4912 generic.go:334] "Generic (PLEG): container finished" podID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerID="0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550" exitCode=0 Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.691950 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9dkt" event={"ID":"797e0d01-0e3c-498f-abe9-5c90c0e53215","Type":"ContainerDied","Data":"0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.696248 4912 generic.go:334] "Generic (PLEG): container finished" podID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerID="52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" 
exitCode=0 Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.696312 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6ggq" event={"ID":"aa54da71-3eed-40ca-a608-43d7f9273e80","Type":"ContainerDied","Data":"52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.696356 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6ggq" event={"ID":"aa54da71-3eed-40ca-a608-43d7f9273e80","Type":"ContainerDied","Data":"892c63f70ffdb84ca74f0ffbd1f49041e818f0194ff1762d06d47cb4d8190ed1"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.696458 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6ggq" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.703849 4912 generic.go:334] "Generic (PLEG): container finished" podID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerID="b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82" exitCode=0 Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.703992 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6x" event={"ID":"46bfbc2d-eb99-4316-a9cc-be875edee92e","Type":"ContainerDied","Data":"b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.704064 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6x" event={"ID":"46bfbc2d-eb99-4316-a9cc-be875edee92e","Type":"ContainerDied","Data":"05f93d4453d99ee553eb4c01ac2655bc8fa0b2428727af6a75bbd1715a73badf"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.704167 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6x" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.705933 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.708344 4912 scope.go:117] "RemoveContainer" containerID="2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.708492 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp5sc\" (UniqueName: \"kubernetes.io/projected/5201881b-c2ba-46b7-aeae-62df63a255e8-kube-api-access-kp5sc\") pod \"5201881b-c2ba-46b7-aeae-62df63a255e8\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.708647 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-utilities\") pod \"5201881b-c2ba-46b7-aeae-62df63a255e8\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.708704 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-utilities\") pod \"46bfbc2d-eb99-4316-a9cc-be875edee92e\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.708754 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-catalog-content\") pod \"5201881b-c2ba-46b7-aeae-62df63a255e8\" (UID: \"5201881b-c2ba-46b7-aeae-62df63a255e8\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.708813 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-rvmvb\" (UniqueName: \"kubernetes.io/projected/46bfbc2d-eb99-4316-a9cc-be875edee92e-kube-api-access-rvmvb\") pod \"46bfbc2d-eb99-4316-a9cc-be875edee92e\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.708838 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-catalog-content\") pod \"46bfbc2d-eb99-4316-a9cc-be875edee92e\" (UID: \"46bfbc2d-eb99-4316-a9cc-be875edee92e\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.709127 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqrgj\" (UniqueName: \"kubernetes.io/projected/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-kube-api-access-xqrgj\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.709144 4912 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.709156 4912 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61f97d4c-a7a2-4d3c-bb11-a397c93efbad-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.711160 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" event={"ID":"70dff85c-f45b-431d-83ad-3b7802b15cd3","Type":"ContainerStarted","Data":"35644ca3ea300a45f71acaf864f05e24c13f97e71c4bcb32a2dc2b9480349f30"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.711225 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" 
event={"ID":"70dff85c-f45b-431d-83ad-3b7802b15cd3","Type":"ContainerStarted","Data":"7334842c774e416c9d7cbcca89937e084532111dffae49e717bc2ec4e9b81359"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.711319 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-utilities" (OuterVolumeSpecName: "utilities") pod "46bfbc2d-eb99-4316-a9cc-be875edee92e" (UID: "46bfbc2d-eb99-4316-a9cc-be875edee92e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.711526 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.711919 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-utilities" (OuterVolumeSpecName: "utilities") pod "5201881b-c2ba-46b7-aeae-62df63a255e8" (UID: "5201881b-c2ba-46b7-aeae-62df63a255e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.713303 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5201881b-c2ba-46b7-aeae-62df63a255e8-kube-api-access-kp5sc" (OuterVolumeSpecName: "kube-api-access-kp5sc") pod "5201881b-c2ba-46b7-aeae-62df63a255e8" (UID: "5201881b-c2ba-46b7-aeae-62df63a255e8"). InnerVolumeSpecName "kube-api-access-kp5sc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.713615 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bfbc2d-eb99-4316-a9cc-be875edee92e-kube-api-access-rvmvb" (OuterVolumeSpecName: "kube-api-access-rvmvb") pod "46bfbc2d-eb99-4316-a9cc-be875edee92e" (UID: "46bfbc2d-eb99-4316-a9cc-be875edee92e"). InnerVolumeSpecName "kube-api-access-rvmvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.715144 4912 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nw2vt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": dial tcp 10.217.0.70:8080: connect: connection refused" start-of-body= Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.715230 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" podUID="70dff85c-f45b-431d-83ad-3b7802b15cd3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": dial tcp 10.217.0.70:8080: connect: connection refused" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.729448 4912 generic.go:334] "Generic (PLEG): container finished" podID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerID="01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b" exitCode=0 Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.729808 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" event={"ID":"61f97d4c-a7a2-4d3c-bb11-a397c93efbad","Type":"ContainerDied","Data":"01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.729851 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" event={"ID":"61f97d4c-a7a2-4d3c-bb11-a397c93efbad","Type":"ContainerDied","Data":"c995453ba382d1cd0de58d55c50f7379ebf9260c3486f05c089ed59311582f94"} Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.729915 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jsbwx" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.749978 4912 scope.go:117] "RemoveContainer" containerID="402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.776089 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" podStartSLOduration=1.776066677 podStartE2EDuration="1.776066677s" podCreationTimestamp="2026-03-18 13:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:07:56.774340535 +0000 UTC m=+325.233767980" watchObservedRunningTime="2026-03-18 13:07:56.776066677 +0000 UTC m=+325.235494102" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.791966 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5201881b-c2ba-46b7-aeae-62df63a255e8" (UID: "5201881b-c2ba-46b7-aeae-62df63a255e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.809679 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9csh\" (UniqueName: \"kubernetes.io/projected/aa54da71-3eed-40ca-a608-43d7f9273e80-kube-api-access-m9csh\") pod \"aa54da71-3eed-40ca-a608-43d7f9273e80\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.809850 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn7gs\" (UniqueName: \"kubernetes.io/projected/797e0d01-0e3c-498f-abe9-5c90c0e53215-kube-api-access-mn7gs\") pod \"797e0d01-0e3c-498f-abe9-5c90c0e53215\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.809900 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-utilities\") pod \"aa54da71-3eed-40ca-a608-43d7f9273e80\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.809953 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-utilities\") pod \"797e0d01-0e3c-498f-abe9-5c90c0e53215\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.810069 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-catalog-content\") pod \"797e0d01-0e3c-498f-abe9-5c90c0e53215\" (UID: \"797e0d01-0e3c-498f-abe9-5c90c0e53215\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.810128 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-catalog-content\") pod \"aa54da71-3eed-40ca-a608-43d7f9273e80\" (UID: \"aa54da71-3eed-40ca-a608-43d7f9273e80\") " Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.810517 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp5sc\" (UniqueName: \"kubernetes.io/projected/5201881b-c2ba-46b7-aeae-62df63a255e8-kube-api-access-kp5sc\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.810545 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.810558 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.810570 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5201881b-c2ba-46b7-aeae-62df63a255e8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.810582 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvmvb\" (UniqueName: \"kubernetes.io/projected/46bfbc2d-eb99-4316-a9cc-be875edee92e-kube-api-access-rvmvb\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.814489 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-utilities" (OuterVolumeSpecName: "utilities") pod "aa54da71-3eed-40ca-a608-43d7f9273e80" (UID: "aa54da71-3eed-40ca-a608-43d7f9273e80"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.815776 4912 scope.go:117] "RemoveContainer" containerID="71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.814797 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-utilities" (OuterVolumeSpecName: "utilities") pod "797e0d01-0e3c-498f-abe9-5c90c0e53215" (UID: "797e0d01-0e3c-498f-abe9-5c90c0e53215"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.816007 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa54da71-3eed-40ca-a608-43d7f9273e80-kube-api-access-m9csh" (OuterVolumeSpecName: "kube-api-access-m9csh") pod "aa54da71-3eed-40ca-a608-43d7f9273e80" (UID: "aa54da71-3eed-40ca-a608-43d7f9273e80"). InnerVolumeSpecName "kube-api-access-m9csh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.818065 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797e0d01-0e3c-498f-abe9-5c90c0e53215-kube-api-access-mn7gs" (OuterVolumeSpecName: "kube-api-access-mn7gs") pod "797e0d01-0e3c-498f-abe9-5c90c0e53215" (UID: "797e0d01-0e3c-498f-abe9-5c90c0e53215"). InnerVolumeSpecName "kube-api-access-mn7gs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.821183 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5\": container with ID starting with 71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5 not found: ID does not exist" containerID="71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.821228 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5"} err="failed to get container status \"71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5\": rpc error: code = NotFound desc = could not find container \"71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5\": container with ID starting with 71388427ee7cb8eefb43ad8c993c389d0ebd58320af32b9f8af2141acc3261c5 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.821257 4912 scope.go:117] "RemoveContainer" containerID="2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.827413 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35\": container with ID starting with 2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35 not found: ID does not exist" containerID="2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.827477 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35"} 
err="failed to get container status \"2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35\": rpc error: code = NotFound desc = could not find container \"2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35\": container with ID starting with 2615d165d60819eb4fd8f1c23bde5a60389c4850974c1553ff0539e16e610d35 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.827501 4912 scope.go:117] "RemoveContainer" containerID="402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.827877 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0\": container with ID starting with 402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0 not found: ID does not exist" containerID="402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.827901 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0"} err="failed to get container status \"402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0\": rpc error: code = NotFound desc = could not find container \"402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0\": container with ID starting with 402228f474f2652f3afcdca785112c4ed929d231b9a0dc8bfb9524804821a6d0 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.827915 4912 scope.go:117] "RemoveContainer" containerID="52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.852864 4912 scope.go:117] "RemoveContainer" containerID="b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7" Mar 18 13:07:56 crc kubenswrapper[4912]: 
I0318 13:07:56.874538 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jsbwx"] Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.883459 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jsbwx"] Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.893326 4912 scope.go:117] "RemoveContainer" containerID="66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.896463 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "797e0d01-0e3c-498f-abe9-5c90c0e53215" (UID: "797e0d01-0e3c-498f-abe9-5c90c0e53215"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.905264 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa54da71-3eed-40ca-a608-43d7f9273e80" (UID: "aa54da71-3eed-40ca-a608-43d7f9273e80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.909838 4912 scope.go:117] "RemoveContainer" containerID="52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.910446 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056\": container with ID starting with 52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056 not found: ID does not exist" containerID="52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.910587 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056"} err="failed to get container status \"52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056\": rpc error: code = NotFound desc = could not find container \"52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056\": container with ID starting with 52fd1206b5677fd2fa881c785909f5cfe45469eccba5090537c1807db999f056 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.910680 4912 scope.go:117] "RemoveContainer" containerID="b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.911655 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7\": container with ID starting with b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7 not found: ID does not exist" containerID="b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.911743 
4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7"} err="failed to get container status \"b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7\": rpc error: code = NotFound desc = could not find container \"b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7\": container with ID starting with b0ab7b8fe95010663854387dda4ac208f554cf5b47ca71a546b9c67feaf286a7 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.911824 4912 scope.go:117] "RemoveContainer" containerID="66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.911918 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn7gs\" (UniqueName: \"kubernetes.io/projected/797e0d01-0e3c-498f-abe9-5c90c0e53215-kube-api-access-mn7gs\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.911973 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.911989 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.912000 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797e0d01-0e3c-498f-abe9-5c90c0e53215-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.912013 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/aa54da71-3eed-40ca-a608-43d7f9273e80-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.912025 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9csh\" (UniqueName: \"kubernetes.io/projected/aa54da71-3eed-40ca-a608-43d7f9273e80-kube-api-access-m9csh\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.912450 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338\": container with ID starting with 66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338 not found: ID does not exist" containerID="66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.912492 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338"} err="failed to get container status \"66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338\": rpc error: code = NotFound desc = could not find container \"66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338\": container with ID starting with 66756a50e49bc6dc4cbeac079b86755ef2688f40328ff59ba22b57ffab07a338 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.912521 4912 scope.go:117] "RemoveContainer" containerID="b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.927207 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46bfbc2d-eb99-4316-a9cc-be875edee92e" (UID: "46bfbc2d-eb99-4316-a9cc-be875edee92e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.934000 4912 scope.go:117] "RemoveContainer" containerID="d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.959869 4912 scope.go:117] "RemoveContainer" containerID="8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.978386 4912 scope.go:117] "RemoveContainer" containerID="b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.980541 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82\": container with ID starting with b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82 not found: ID does not exist" containerID="b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.980607 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82"} err="failed to get container status \"b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82\": rpc error: code = NotFound desc = could not find container \"b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82\": container with ID starting with b020b15dfcca000dc8a6419d231ac5cf07e4888f891a1371fbfb5a086a7acc82 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.980656 4912 scope.go:117] "RemoveContainer" containerID="d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.981267 4912 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77\": container with ID starting with d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77 not found: ID does not exist" containerID="d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.981335 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77"} err="failed to get container status \"d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77\": rpc error: code = NotFound desc = could not find container \"d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77\": container with ID starting with d4b01c6916a9be0bdbec3c13885931dc4f47ae05a095d07cd9734ddfeff2fd77 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.981373 4912 scope.go:117] "RemoveContainer" containerID="8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.981866 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37\": container with ID starting with 8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37 not found: ID does not exist" containerID="8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.981936 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37"} err="failed to get container status \"8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37\": rpc error: code = NotFound desc = could not find container 
\"8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37\": container with ID starting with 8dd96e61256941b65ceee59f98f7451da208a82a98f0c54a64a7e59889fcaa37 not found: ID does not exist" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.981978 4912 scope.go:117] "RemoveContainer" containerID="01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.997920 4912 scope.go:117] "RemoveContainer" containerID="01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b" Mar 18 13:07:56 crc kubenswrapper[4912]: E0318 13:07:56.999002 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b\": container with ID starting with 01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b not found: ID does not exist" containerID="01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b" Mar 18 13:07:56 crc kubenswrapper[4912]: I0318 13:07:56.999083 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b"} err="failed to get container status \"01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b\": rpc error: code = NotFound desc = could not find container \"01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b\": container with ID starting with 01c82a70ff8e7321ade25fb62d19abd16368b28531cdfefadd3f94a979c3143b not found: ID does not exist" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.015586 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46bfbc2d-eb99-4316-a9cc-be875edee92e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.026450 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-65p9d"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.033239 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-65p9d"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.048019 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6x"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.052257 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6x"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.065619 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6ggq"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.074429 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k6ggq"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.356937 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.418921 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sq25z"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.460664 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-98dk7"] Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461183 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461196 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461204 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461210 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461224 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerName="marketplace-operator" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461232 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerName="marketplace-operator" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461243 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461251 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461268 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461277 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461288 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461295 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461305 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461312 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461323 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461331 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461341 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461348 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461357 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461364 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461392 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461400 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="extract-utilities" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461414 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461424 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: E0318 13:07:57.461434 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461441 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="extract-content" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461553 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" containerName="marketplace-operator" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461569 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461578 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461586 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.461593 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" containerName="registry-server" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.462376 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.464552 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.475975 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98dk7"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.625362 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-catalog-content\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.625502 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-utilities\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.625541 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94t5x\" (UniqueName: \"kubernetes.io/projected/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-kube-api-access-94t5x\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.726764 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-catalog-content\") pod \"redhat-marketplace-98dk7\" (UID: 
\"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.726855 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-utilities\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.726884 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94t5x\" (UniqueName: \"kubernetes.io/projected/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-kube-api-access-94t5x\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.727352 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-utilities\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.727352 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-catalog-content\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.737061 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9dkt" event={"ID":"797e0d01-0e3c-498f-abe9-5c90c0e53215","Type":"ContainerDied","Data":"c4e82dfe6ed5e99682af78ab2e76de920895f93dc15cac0c3c2f817f7748307c"} Mar 18 13:07:57 crc 
kubenswrapper[4912]: I0318 13:07:57.737115 4912 scope.go:117] "RemoveContainer" containerID="0b5f688c8bc02e901735ec085e20a44d3a465498144926d487653f2cf384f550" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.737117 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9dkt" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.747382 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.751478 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94t5x\" (UniqueName: \"kubernetes.io/projected/73cfec7d-c7e6-4beb-9a85-f161c2c7c31a-kube-api-access-94t5x\") pod \"redhat-marketplace-98dk7\" (UID: \"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a\") " pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.755950 4912 scope.go:117] "RemoveContainer" containerID="8121e02888c8a512f9c7f174bbd8b1b6ddd56da564624e9a6c1b2b34eb04703c" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.781836 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.806538 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9dkt"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.813681 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r9dkt"] Mar 18 13:07:57 crc kubenswrapper[4912]: I0318 13:07:57.825348 4912 scope.go:117] "RemoveContainer" containerID="e18cd9a9c97e2b3dc207b9ce9f8ad044b620f79fa50d25d7f7cc798b5b19194a" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.210451 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98dk7"] Mar 18 13:07:58 crc kubenswrapper[4912]: W0318 13:07:58.215518 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73cfec7d_c7e6_4beb_9a85_f161c2c7c31a.slice/crio-2ca879f70f2224258795d5d7c6e78d1451a47b1642bdbc25a009f61aeebd84e4 WatchSource:0}: Error finding container 2ca879f70f2224258795d5d7c6e78d1451a47b1642bdbc25a009f61aeebd84e4: Status 404 returned error can't find the container with id 2ca879f70f2224258795d5d7c6e78d1451a47b1642bdbc25a009f61aeebd84e4 Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.252867 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46bfbc2d-eb99-4316-a9cc-be875edee92e" path="/var/lib/kubelet/pods/46bfbc2d-eb99-4316-a9cc-be875edee92e/volumes" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.253586 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5201881b-c2ba-46b7-aeae-62df63a255e8" path="/var/lib/kubelet/pods/5201881b-c2ba-46b7-aeae-62df63a255e8/volumes" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.254261 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f97d4c-a7a2-4d3c-bb11-a397c93efbad" 
path="/var/lib/kubelet/pods/61f97d4c-a7a2-4d3c-bb11-a397c93efbad/volumes" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.255181 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797e0d01-0e3c-498f-abe9-5c90c0e53215" path="/var/lib/kubelet/pods/797e0d01-0e3c-498f-abe9-5c90c0e53215/volumes" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.255821 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa54da71-3eed-40ca-a608-43d7f9273e80" path="/var/lib/kubelet/pods/aa54da71-3eed-40ca-a608-43d7f9273e80/volumes" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.463862 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfv6d"] Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.465962 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.470507 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.473400 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfv6d"] Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.539891 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9q8\" (UniqueName: \"kubernetes.io/projected/4375d78c-761e-4691-9da9-89f56373ea76-kube-api-access-7f9q8\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.539953 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4375d78c-761e-4691-9da9-89f56373ea76-catalog-content\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.540379 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4375d78c-761e-4691-9da9-89f56373ea76-utilities\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.641717 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4375d78c-761e-4691-9da9-89f56373ea76-utilities\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.641796 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9q8\" (UniqueName: \"kubernetes.io/projected/4375d78c-761e-4691-9da9-89f56373ea76-kube-api-access-7f9q8\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.641828 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4375d78c-761e-4691-9da9-89f56373ea76-catalog-content\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.642444 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4375d78c-761e-4691-9da9-89f56373ea76-utilities\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.643907 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4375d78c-761e-4691-9da9-89f56373ea76-catalog-content\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.661946 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9q8\" (UniqueName: \"kubernetes.io/projected/4375d78c-761e-4691-9da9-89f56373ea76-kube-api-access-7f9q8\") pod \"certified-operators-sfv6d\" (UID: \"4375d78c-761e-4691-9da9-89f56373ea76\") " pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.752271 4912 generic.go:334] "Generic (PLEG): container finished" podID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerID="b3c99132d52ff9ad6be6392c5d5cdbc3446cb35721dacf743133dd62f6378430" exitCode=0 Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.752357 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98dk7" event={"ID":"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a","Type":"ContainerDied","Data":"b3c99132d52ff9ad6be6392c5d5cdbc3446cb35721dacf743133dd62f6378430"} Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.752396 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98dk7" event={"ID":"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a","Type":"ContainerStarted","Data":"2ca879f70f2224258795d5d7c6e78d1451a47b1642bdbc25a009f61aeebd84e4"} Mar 18 13:07:58 crc kubenswrapper[4912]: I0318 13:07:58.792266 4912 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.233966 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfv6d"] Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.768414 4912 generic.go:334] "Generic (PLEG): container finished" podID="4375d78c-761e-4691-9da9-89f56373ea76" containerID="9049ad629dfc130623c58a94cbe3287f60ecdd610d2a0f8d1eba6e209d2d83a4" exitCode=0 Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.768587 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfv6d" event={"ID":"4375d78c-761e-4691-9da9-89f56373ea76","Type":"ContainerDied","Data":"9049ad629dfc130623c58a94cbe3287f60ecdd610d2a0f8d1eba6e209d2d83a4"} Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.768907 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfv6d" event={"ID":"4375d78c-761e-4691-9da9-89f56373ea76","Type":"ContainerStarted","Data":"28f26fc765ed2ce8cc5e98d51e81f418c7e5e6dffd20a38a7cad306ccfa8862d"} Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.775547 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98dk7" event={"ID":"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a","Type":"ContainerStarted","Data":"fdb3218497358a07a50ae49f5d3d48df59603117cda2f1e0411a3912d5ae9dd1"} Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.860370 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pr4zx"] Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.861571 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.864752 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.872159 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pr4zx"] Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.964008 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5944127-745d-42f9-83c2-d448435da4c9-catalog-content\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.964067 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zm2s\" (UniqueName: \"kubernetes.io/projected/b5944127-745d-42f9-83c2-d448435da4c9-kube-api-access-6zm2s\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:07:59 crc kubenswrapper[4912]: I0318 13:07:59.964107 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5944127-745d-42f9-83c2-d448435da4c9-utilities\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.066077 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5944127-745d-42f9-83c2-d448435da4c9-catalog-content\") pod \"redhat-operators-pr4zx\" (UID: 
\"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.066136 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zm2s\" (UniqueName: \"kubernetes.io/projected/b5944127-745d-42f9-83c2-d448435da4c9-kube-api-access-6zm2s\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.066190 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5944127-745d-42f9-83c2-d448435da4c9-utilities\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.066705 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5944127-745d-42f9-83c2-d448435da4c9-catalog-content\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.066748 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5944127-745d-42f9-83c2-d448435da4c9-utilities\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.093251 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zm2s\" (UniqueName: \"kubernetes.io/projected/b5944127-745d-42f9-83c2-d448435da4c9-kube-api-access-6zm2s\") pod \"redhat-operators-pr4zx\" (UID: \"b5944127-745d-42f9-83c2-d448435da4c9\") " 
pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.141061 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563988-s78jq"] Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.141779 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-s78jq" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.143865 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.144313 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.145575 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.151209 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-s78jq"] Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.192862 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.270103 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pwgr\" (UniqueName: \"kubernetes.io/projected/e8efe2a7-7ebf-472a-8c44-3e2f15209acf-kube-api-access-2pwgr\") pod \"auto-csr-approver-29563988-s78jq\" (UID: \"e8efe2a7-7ebf-472a-8c44-3e2f15209acf\") " pod="openshift-infra/auto-csr-approver-29563988-s78jq" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.372137 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pwgr\" (UniqueName: \"kubernetes.io/projected/e8efe2a7-7ebf-472a-8c44-3e2f15209acf-kube-api-access-2pwgr\") pod \"auto-csr-approver-29563988-s78jq\" (UID: \"e8efe2a7-7ebf-472a-8c44-3e2f15209acf\") " pod="openshift-infra/auto-csr-approver-29563988-s78jq" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.404902 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pwgr\" (UniqueName: \"kubernetes.io/projected/e8efe2a7-7ebf-472a-8c44-3e2f15209acf-kube-api-access-2pwgr\") pod \"auto-csr-approver-29563988-s78jq\" (UID: \"e8efe2a7-7ebf-472a-8c44-3e2f15209acf\") " pod="openshift-infra/auto-csr-approver-29563988-s78jq" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.409775 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pr4zx"] Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.469609 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-s78jq" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.786067 4912 generic.go:334] "Generic (PLEG): container finished" podID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerID="fdb3218497358a07a50ae49f5d3d48df59603117cda2f1e0411a3912d5ae9dd1" exitCode=0 Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.786141 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98dk7" event={"ID":"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a","Type":"ContainerDied","Data":"fdb3218497358a07a50ae49f5d3d48df59603117cda2f1e0411a3912d5ae9dd1"} Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.788413 4912 generic.go:334] "Generic (PLEG): container finished" podID="b5944127-745d-42f9-83c2-d448435da4c9" containerID="284423f54c8da104e565af8513e1eebc9f51f17b8c84408787336aac6e6a046f" exitCode=0 Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.788500 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr4zx" event={"ID":"b5944127-745d-42f9-83c2-d448435da4c9","Type":"ContainerDied","Data":"284423f54c8da104e565af8513e1eebc9f51f17b8c84408787336aac6e6a046f"} Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.788531 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr4zx" event={"ID":"b5944127-745d-42f9-83c2-d448435da4c9","Type":"ContainerStarted","Data":"9224bf6e5e728dee56292a1b81c9f4a1ce4e61b79879c0941e75a56bb2724029"} Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.794143 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfv6d" event={"ID":"4375d78c-761e-4691-9da9-89f56373ea76","Type":"ContainerStarted","Data":"183811f19643ed3f2365c40156151ddeda2f18c3a8f6cfa4e29f75409564f2ba"} Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.860767 4912 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-g6mtn"] Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.862293 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.864388 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.890971 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6mtn"] Mar 18 13:08:00 crc kubenswrapper[4912]: I0318 13:08:00.962622 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-s78jq"] Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.023787 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr26m\" (UniqueName: \"kubernetes.io/projected/be01ffc1-29df-445f-b0e7-6dd0e80c6297-kube-api-access-cr26m\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.023846 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be01ffc1-29df-445f-b0e7-6dd0e80c6297-utilities\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.023937 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be01ffc1-29df-445f-b0e7-6dd0e80c6297-catalog-content\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " 
pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.125688 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr26m\" (UniqueName: \"kubernetes.io/projected/be01ffc1-29df-445f-b0e7-6dd0e80c6297-kube-api-access-cr26m\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.125755 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be01ffc1-29df-445f-b0e7-6dd0e80c6297-utilities\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.125820 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be01ffc1-29df-445f-b0e7-6dd0e80c6297-catalog-content\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.126339 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be01ffc1-29df-445f-b0e7-6dd0e80c6297-catalog-content\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.126717 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be01ffc1-29df-445f-b0e7-6dd0e80c6297-utilities\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " 
pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.158491 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr26m\" (UniqueName: \"kubernetes.io/projected/be01ffc1-29df-445f-b0e7-6dd0e80c6297-kube-api-access-cr26m\") pod \"community-operators-g6mtn\" (UID: \"be01ffc1-29df-445f-b0e7-6dd0e80c6297\") " pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.186887 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.602836 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6mtn"] Mar 18 13:08:01 crc kubenswrapper[4912]: W0318 13:08:01.608250 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe01ffc1_29df_445f_b0e7_6dd0e80c6297.slice/crio-0fac52a52092b97ca33d7118317aa32a30d8e271894a020c40c9c8ad433273cc WatchSource:0}: Error finding container 0fac52a52092b97ca33d7118317aa32a30d8e271894a020c40c9c8ad433273cc: Status 404 returned error can't find the container with id 0fac52a52092b97ca33d7118317aa32a30d8e271894a020c40c9c8ad433273cc Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.802062 4912 generic.go:334] "Generic (PLEG): container finished" podID="be01ffc1-29df-445f-b0e7-6dd0e80c6297" containerID="26b37e07fdcd6fd4a7404819603030d827f98215e26bd0783c4a9533554fc5b8" exitCode=0 Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.802124 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6mtn" event={"ID":"be01ffc1-29df-445f-b0e7-6dd0e80c6297","Type":"ContainerDied","Data":"26b37e07fdcd6fd4a7404819603030d827f98215e26bd0783c4a9533554fc5b8"} Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.802158 
4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6mtn" event={"ID":"be01ffc1-29df-445f-b0e7-6dd0e80c6297","Type":"ContainerStarted","Data":"0fac52a52092b97ca33d7118317aa32a30d8e271894a020c40c9c8ad433273cc"} Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.807362 4912 generic.go:334] "Generic (PLEG): container finished" podID="4375d78c-761e-4691-9da9-89f56373ea76" containerID="183811f19643ed3f2365c40156151ddeda2f18c3a8f6cfa4e29f75409564f2ba" exitCode=0 Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.807517 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfv6d" event={"ID":"4375d78c-761e-4691-9da9-89f56373ea76","Type":"ContainerDied","Data":"183811f19643ed3f2365c40156151ddeda2f18c3a8f6cfa4e29f75409564f2ba"} Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.813373 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98dk7" event={"ID":"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a","Type":"ContainerStarted","Data":"09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b"} Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.816127 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr4zx" event={"ID":"b5944127-745d-42f9-83c2-d448435da4c9","Type":"ContainerStarted","Data":"d1391b67a6710ba2ef0a06c000525f0c60691f16eff85e1e29714a97898dec81"} Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.818396 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-s78jq" event={"ID":"e8efe2a7-7ebf-472a-8c44-3e2f15209acf","Type":"ContainerStarted","Data":"010fbe177651d007ace92a0d678dc3598f657d2a4eb63af5338b6c2d6f46a7ec"} Mar 18 13:08:01 crc kubenswrapper[4912]: I0318 13:08:01.892801 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-98dk7" 
podStartSLOduration=2.320553557 podStartE2EDuration="4.892768318s" podCreationTimestamp="2026-03-18 13:07:57 +0000 UTC" firstStartedPulling="2026-03-18 13:07:58.754587843 +0000 UTC m=+327.214015268" lastFinishedPulling="2026-03-18 13:08:01.326802604 +0000 UTC m=+329.786230029" observedRunningTime="2026-03-18 13:08:01.869239822 +0000 UTC m=+330.328667277" watchObservedRunningTime="2026-03-18 13:08:01.892768318 +0000 UTC m=+330.352195743" Mar 18 13:08:02 crc kubenswrapper[4912]: I0318 13:08:02.830099 4912 generic.go:334] "Generic (PLEG): container finished" podID="b5944127-745d-42f9-83c2-d448435da4c9" containerID="d1391b67a6710ba2ef0a06c000525f0c60691f16eff85e1e29714a97898dec81" exitCode=0 Mar 18 13:08:02 crc kubenswrapper[4912]: I0318 13:08:02.830231 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr4zx" event={"ID":"b5944127-745d-42f9-83c2-d448435da4c9","Type":"ContainerDied","Data":"d1391b67a6710ba2ef0a06c000525f0c60691f16eff85e1e29714a97898dec81"} Mar 18 13:08:02 crc kubenswrapper[4912]: I0318 13:08:02.839856 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-s78jq" event={"ID":"e8efe2a7-7ebf-472a-8c44-3e2f15209acf","Type":"ContainerStarted","Data":"a4feec8dbebc8bdb216579be927254a771ac0d5d393536fb178e4546ceef6aab"} Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.863330 4912 generic.go:334] "Generic (PLEG): container finished" podID="e8efe2a7-7ebf-472a-8c44-3e2f15209acf" containerID="a4feec8dbebc8bdb216579be927254a771ac0d5d393536fb178e4546ceef6aab" exitCode=0 Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.863428 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-s78jq" event={"ID":"e8efe2a7-7ebf-472a-8c44-3e2f15209acf","Type":"ContainerDied","Data":"a4feec8dbebc8bdb216579be927254a771ac0d5d393536fb178e4546ceef6aab"} Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.865952 4912 
generic.go:334] "Generic (PLEG): container finished" podID="be01ffc1-29df-445f-b0e7-6dd0e80c6297" containerID="aeba972f4169bda69257f065a9d106b3630c2264ee72aebceb4ce4ce4006ef52" exitCode=0 Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.865988 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6mtn" event={"ID":"be01ffc1-29df-445f-b0e7-6dd0e80c6297","Type":"ContainerDied","Data":"aeba972f4169bda69257f065a9d106b3630c2264ee72aebceb4ce4ce4006ef52"} Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.872412 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfv6d" event={"ID":"4375d78c-761e-4691-9da9-89f56373ea76","Type":"ContainerStarted","Data":"dfac0b78ae023fd8321e9b93693b9d46d15550ece790d8e88d18bcbdd6480a54"} Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.877098 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr4zx" event={"ID":"b5944127-745d-42f9-83c2-d448435da4c9","Type":"ContainerStarted","Data":"bb1a4329b02f5c18ad6e4252251b84c8eef6f4599835260ffdee13f79f95b712"} Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.920103 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pr4zx" podStartSLOduration=2.229773545 podStartE2EDuration="4.920082008s" podCreationTimestamp="2026-03-18 13:07:59 +0000 UTC" firstStartedPulling="2026-03-18 13:08:00.798554543 +0000 UTC m=+329.257981968" lastFinishedPulling="2026-03-18 13:08:03.488863006 +0000 UTC m=+331.948290431" observedRunningTime="2026-03-18 13:08:03.916317675 +0000 UTC m=+332.375745120" watchObservedRunningTime="2026-03-18 13:08:03.920082008 +0000 UTC m=+332.379509433" Mar 18 13:08:03 crc kubenswrapper[4912]: I0318 13:08:03.936758 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfv6d" 
podStartSLOduration=2.993629403 podStartE2EDuration="5.936734648s" podCreationTimestamp="2026-03-18 13:07:58 +0000 UTC" firstStartedPulling="2026-03-18 13:07:59.771170501 +0000 UTC m=+328.230597926" lastFinishedPulling="2026-03-18 13:08:02.714275746 +0000 UTC m=+331.173703171" observedRunningTime="2026-03-18 13:08:03.932612184 +0000 UTC m=+332.392039619" watchObservedRunningTime="2026-03-18 13:08:03.936734648 +0000 UTC m=+332.396162083" Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.162742 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-s78jq" Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.281601 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pwgr\" (UniqueName: \"kubernetes.io/projected/e8efe2a7-7ebf-472a-8c44-3e2f15209acf-kube-api-access-2pwgr\") pod \"e8efe2a7-7ebf-472a-8c44-3e2f15209acf\" (UID: \"e8efe2a7-7ebf-472a-8c44-3e2f15209acf\") " Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.287929 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8efe2a7-7ebf-472a-8c44-3e2f15209acf-kube-api-access-2pwgr" (OuterVolumeSpecName: "kube-api-access-2pwgr") pod "e8efe2a7-7ebf-472a-8c44-3e2f15209acf" (UID: "e8efe2a7-7ebf-472a-8c44-3e2f15209acf"). InnerVolumeSpecName "kube-api-access-2pwgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.383276 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pwgr\" (UniqueName: \"kubernetes.io/projected/e8efe2a7-7ebf-472a-8c44-3e2f15209acf-kube-api-access-2pwgr\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.885533 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-s78jq" Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.885551 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-s78jq" event={"ID":"e8efe2a7-7ebf-472a-8c44-3e2f15209acf","Type":"ContainerDied","Data":"010fbe177651d007ace92a0d678dc3598f657d2a4eb63af5338b6c2d6f46a7ec"} Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.885594 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="010fbe177651d007ace92a0d678dc3598f657d2a4eb63af5338b6c2d6f46a7ec" Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.890351 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6mtn" event={"ID":"be01ffc1-29df-445f-b0e7-6dd0e80c6297","Type":"ContainerStarted","Data":"0e6f50b7be188c9ddca258b40b40e0ec7708429caa7528dd8e716c6cb882b96c"} Mar 18 13:08:04 crc kubenswrapper[4912]: I0318 13:08:04.919836 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6mtn" podStartSLOduration=2.378558867 podStartE2EDuration="4.91981511s" podCreationTimestamp="2026-03-18 13:08:00 +0000 UTC" firstStartedPulling="2026-03-18 13:08:01.803727468 +0000 UTC m=+330.263154893" lastFinishedPulling="2026-03-18 13:08:04.344983711 +0000 UTC m=+332.804411136" observedRunningTime="2026-03-18 13:08:04.909840981 +0000 UTC m=+333.369268406" watchObservedRunningTime="2026-03-18 13:08:04.91981511 +0000 UTC m=+333.379242535" Mar 18 13:08:07 crc kubenswrapper[4912]: I0318 13:08:07.785728 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:08:07 crc kubenswrapper[4912]: I0318 13:08:07.786254 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:08:07 crc kubenswrapper[4912]: I0318 
13:08:07.854394 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:08:07 crc kubenswrapper[4912]: I0318 13:08:07.949611 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 13:08:08 crc kubenswrapper[4912]: I0318 13:08:08.793339 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:08:08 crc kubenswrapper[4912]: I0318 13:08:08.793907 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:08:08 crc kubenswrapper[4912]: I0318 13:08:08.840500 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:08:08 crc kubenswrapper[4912]: I0318 13:08:08.952931 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfv6d" Mar 18 13:08:10 crc kubenswrapper[4912]: I0318 13:08:10.193838 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:10 crc kubenswrapper[4912]: I0318 13:08:10.194028 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:10 crc kubenswrapper[4912]: I0318 13:08:10.250935 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:10 crc kubenswrapper[4912]: I0318 13:08:10.994256 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pr4zx" Mar 18 13:08:11 crc kubenswrapper[4912]: I0318 13:08:11.187821 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:11 crc kubenswrapper[4912]: I0318 13:08:11.187886 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:11 crc kubenswrapper[4912]: I0318 13:08:11.227104 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:11 crc kubenswrapper[4912]: I0318 13:08:11.989829 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6mtn" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.474174 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" podUID="1e116845-2d89-48e3-b832-584be4553fd3" containerName="registry" containerID="cri-o://5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a" gracePeriod=30 Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.880903 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.960740 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-trusted-ca\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.960865 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pwhl\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-kube-api-access-9pwhl\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.960913 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-registry-tls\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.961200 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.961287 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e116845-2d89-48e3-b832-584be4553fd3-installation-pull-secrets\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.961326 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e116845-2d89-48e3-b832-584be4553fd3-ca-trust-extracted\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.961382 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-registry-certificates\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.961430 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-bound-sa-token\") pod \"1e116845-2d89-48e3-b832-584be4553fd3\" (UID: \"1e116845-2d89-48e3-b832-584be4553fd3\") " Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.962340 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.963719 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.970295 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.970700 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-kube-api-access-9pwhl" (OuterVolumeSpecName: "kube-api-access-9pwhl") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "kube-api-access-9pwhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.971121 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e116845-2d89-48e3-b832-584be4553fd3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.972956 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.973359 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:08:22 crc kubenswrapper[4912]: I0318 13:08:22.981359 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e116845-2d89-48e3-b832-584be4553fd3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1e116845-2d89-48e3-b832-584be4553fd3" (UID: "1e116845-2d89-48e3-b832-584be4553fd3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.021156 4912 generic.go:334] "Generic (PLEG): container finished" podID="1e116845-2d89-48e3-b832-584be4553fd3" containerID="5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a" exitCode=0 Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.021233 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" event={"ID":"1e116845-2d89-48e3-b832-584be4553fd3","Type":"ContainerDied","Data":"5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a"} Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.021283 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.021314 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sq25z" event={"ID":"1e116845-2d89-48e3-b832-584be4553fd3","Type":"ContainerDied","Data":"5cd768257b5a38310177368d47ef1a10ad63fc3f1724fb425a036417d8187552"} Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.021337 4912 scope.go:117] "RemoveContainer" containerID="5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.046750 4912 scope.go:117] "RemoveContainer" containerID="5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a" Mar 18 13:08:23 crc kubenswrapper[4912]: E0318 13:08:23.047507 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a\": container with ID starting with 5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a not found: ID does not exist" containerID="5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.047580 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a"} err="failed to get container status \"5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a\": rpc error: code = NotFound desc = could not find container \"5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a\": container with ID starting with 5047f3c954cf400744e6106c2500131a816dadb15f71f47860391d1269787f3a not found: ID does not exist" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.061934 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-sq25z"] Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.062932 4912 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e116845-2d89-48e3-b832-584be4553fd3-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.063172 4912 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e116845-2d89-48e3-b832-584be4553fd3-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.063187 4912 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.063202 4912 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.063213 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e116845-2d89-48e3-b832-584be4553fd3-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.063223 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pwhl\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-kube-api-access-9pwhl\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.063232 4912 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e116845-2d89-48e3-b832-584be4553fd3-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 
13:08:23 crc kubenswrapper[4912]: I0318 13:08:23.068344 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sq25z"] Mar 18 13:08:24 crc kubenswrapper[4912]: I0318 13:08:24.235195 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e116845-2d89-48e3-b832-584be4553fd3" path="/var/lib/kubelet/pods/1e116845-2d89-48e3-b832-584be4553fd3/volumes" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.711777 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2"] Mar 18 13:08:49 crc kubenswrapper[4912]: E0318 13:08:49.714183 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e116845-2d89-48e3-b832-584be4553fd3" containerName="registry" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.714304 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e116845-2d89-48e3-b832-584be4553fd3" containerName="registry" Mar 18 13:08:49 crc kubenswrapper[4912]: E0318 13:08:49.714397 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8efe2a7-7ebf-472a-8c44-3e2f15209acf" containerName="oc" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.714483 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8efe2a7-7ebf-472a-8c44-3e2f15209acf" containerName="oc" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.714836 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8efe2a7-7ebf-472a-8c44-3e2f15209acf" containerName="oc" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.714932 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e116845-2d89-48e3-b832-584be4553fd3" containerName="registry" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.715643 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.718565 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.718629 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.718848 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.718920 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.719547 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.719969 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2"] Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.855286 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b502ad98-f0ea-4602-99e4-12fd211ed99e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.855362 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd642\" (UniqueName: \"kubernetes.io/projected/b502ad98-f0ea-4602-99e4-12fd211ed99e-kube-api-access-fd642\") pod 
\"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.855397 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b502ad98-f0ea-4602-99e4-12fd211ed99e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.957859 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b502ad98-f0ea-4602-99e4-12fd211ed99e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.957962 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd642\" (UniqueName: \"kubernetes.io/projected/b502ad98-f0ea-4602-99e4-12fd211ed99e-kube-api-access-fd642\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.958016 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b502ad98-f0ea-4602-99e4-12fd211ed99e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc 
kubenswrapper[4912]: I0318 13:08:49.959682 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b502ad98-f0ea-4602-99e4-12fd211ed99e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.967756 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b502ad98-f0ea-4602-99e4-12fd211ed99e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:49 crc kubenswrapper[4912]: I0318 13:08:49.975953 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd642\" (UniqueName: \"kubernetes.io/projected/b502ad98-f0ea-4602-99e4-12fd211ed99e-kube-api-access-fd642\") pod \"cluster-monitoring-operator-6d5b84845-zqvn2\" (UID: \"b502ad98-f0ea-4602-99e4-12fd211ed99e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:50 crc kubenswrapper[4912]: I0318 13:08:50.034511 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" Mar 18 13:08:50 crc kubenswrapper[4912]: I0318 13:08:50.249338 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2"] Mar 18 13:08:50 crc kubenswrapper[4912]: I0318 13:08:50.461631 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" event={"ID":"b502ad98-f0ea-4602-99e4-12fd211ed99e","Type":"ContainerStarted","Data":"4c2b80ee66cc26af70b949c6db852105b28785dca949f6fe10f95d8a69e8b136"} Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.345392 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w"] Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.346681 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.350679 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.351082 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qhq4c" Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.355316 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w"] Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.408758 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/59c7d762-a4b8-452d-8824-572aa03c40fd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-84z2w\" (UID: 
\"59c7d762-a4b8-452d-8824-572aa03c40fd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.478881 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" event={"ID":"b502ad98-f0ea-4602-99e4-12fd211ed99e","Type":"ContainerStarted","Data":"d2317ce682106dc96266be3793c717c08b5281e385e1e66f2990f79ac4761b02"} Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.494978 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zqvn2" podStartSLOduration=2.074182484 podStartE2EDuration="4.494957292s" podCreationTimestamp="2026-03-18 13:08:49 +0000 UTC" firstStartedPulling="2026-03-18 13:08:50.267662996 +0000 UTC m=+378.727090411" lastFinishedPulling="2026-03-18 13:08:52.688437784 +0000 UTC m=+381.147865219" observedRunningTime="2026-03-18 13:08:53.492996153 +0000 UTC m=+381.952423588" watchObservedRunningTime="2026-03-18 13:08:53.494957292 +0000 UTC m=+381.954384717" Mar 18 13:08:53 crc kubenswrapper[4912]: I0318 13:08:53.510354 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/59c7d762-a4b8-452d-8824-572aa03c40fd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-84z2w\" (UID: \"59c7d762-a4b8-452d-8824-572aa03c40fd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:53 crc kubenswrapper[4912]: E0318 13:08:53.510568 4912 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Mar 18 13:08:53 crc kubenswrapper[4912]: E0318 13:08:53.510660 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59c7d762-a4b8-452d-8824-572aa03c40fd-tls-certificates 
podName:59c7d762-a4b8-452d-8824-572aa03c40fd nodeName:}" failed. No retries permitted until 2026-03-18 13:08:54.010633492 +0000 UTC m=+382.470060917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/59c7d762-a4b8-452d-8824-572aa03c40fd-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-84z2w" (UID: "59c7d762-a4b8-452d-8824-572aa03c40fd") : secret "prometheus-operator-admission-webhook-tls" not found Mar 18 13:08:54 crc kubenswrapper[4912]: I0318 13:08:54.018077 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/59c7d762-a4b8-452d-8824-572aa03c40fd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-84z2w\" (UID: \"59c7d762-a4b8-452d-8824-572aa03c40fd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:54 crc kubenswrapper[4912]: I0318 13:08:54.028509 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/59c7d762-a4b8-452d-8824-572aa03c40fd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-84z2w\" (UID: \"59c7d762-a4b8-452d-8824-572aa03c40fd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:54 crc kubenswrapper[4912]: I0318 13:08:54.263676 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:55 crc kubenswrapper[4912]: I0318 13:08:54.714812 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w"] Mar 18 13:08:55 crc kubenswrapper[4912]: I0318 13:08:55.492107 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" event={"ID":"59c7d762-a4b8-452d-8824-572aa03c40fd","Type":"ContainerStarted","Data":"c70a5d91ca7ab11963895d05e9e2a99c30a61402cb45969cccfc726c58e5bebc"} Mar 18 13:08:56 crc kubenswrapper[4912]: I0318 13:08:56.502574 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" event={"ID":"59c7d762-a4b8-452d-8824-572aa03c40fd","Type":"ContainerStarted","Data":"31c653dc504df4845b475e0d0527cac9baa2eab4a2325b160a5da64f7439c716"} Mar 18 13:08:56 crc kubenswrapper[4912]: I0318 13:08:56.502876 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:56 crc kubenswrapper[4912]: I0318 13:08:56.510454 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 13:08:56 crc kubenswrapper[4912]: I0318 13:08:56.529299 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podStartSLOduration=2.143501332 podStartE2EDuration="3.529271522s" podCreationTimestamp="2026-03-18 13:08:53 +0000 UTC" firstStartedPulling="2026-03-18 13:08:54.723731312 +0000 UTC m=+383.183158737" lastFinishedPulling="2026-03-18 13:08:56.109501482 +0000 UTC m=+384.568928927" observedRunningTime="2026-03-18 13:08:56.526098496 +0000 UTC 
m=+384.985525971" watchObservedRunningTime="2026-03-18 13:08:56.529271522 +0000 UTC m=+384.988698967"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.499092 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-slcfx"]
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.500519 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.503281 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.508833 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.508833 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.516573 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-slcfx"]
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.531485 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-m579w"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.575667 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5dab915-9727-452f-a28f-18005580dc90-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.575773 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dab915-9727-452f-a28f-18005580dc90-metrics-client-ca\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.575819 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvwfk\" (UniqueName: \"kubernetes.io/projected/e5dab915-9727-452f-a28f-18005580dc90-kube-api-access-qvwfk\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.575858 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5dab915-9727-452f-a28f-18005580dc90-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.677133 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dab915-9727-452f-a28f-18005580dc90-metrics-client-ca\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.677222 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvwfk\" (UniqueName: \"kubernetes.io/projected/e5dab915-9727-452f-a28f-18005580dc90-kube-api-access-qvwfk\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.677252 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5dab915-9727-452f-a28f-18005580dc90-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.677311 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5dab915-9727-452f-a28f-18005580dc90-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.678822 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dab915-9727-452f-a28f-18005580dc90-metrics-client-ca\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.690800 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5dab915-9727-452f-a28f-18005580dc90-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.690800 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5dab915-9727-452f-a28f-18005580dc90-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.695650 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvwfk\" (UniqueName: \"kubernetes.io/projected/e5dab915-9727-452f-a28f-18005580dc90-kube-api-access-qvwfk\") pod \"prometheus-operator-db54df47d-slcfx\" (UID: \"e5dab915-9727-452f-a28f-18005580dc90\") " pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:57 crc kubenswrapper[4912]: I0318 13:08:57.820881 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx"
Mar 18 13:08:58 crc kubenswrapper[4912]: I0318 13:08:58.037064 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-slcfx"]
Mar 18 13:08:58 crc kubenswrapper[4912]: I0318 13:08:58.528313 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx" event={"ID":"e5dab915-9727-452f-a28f-18005580dc90","Type":"ContainerStarted","Data":"a4871b7ca33768fd88d6cb40c2a432acf6f8e8c10b79a57a8bd0588d625db65f"}
Mar 18 13:09:00 crc kubenswrapper[4912]: I0318 13:09:00.551338 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx" event={"ID":"e5dab915-9727-452f-a28f-18005580dc90","Type":"ContainerStarted","Data":"560e06cfcc692702389b18c810c7a2ca4d0b2cb9f670b4d3a2d036868c0cdf64"}
Mar 18 13:09:00 crc kubenswrapper[4912]: I0318 13:09:00.552144 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx" event={"ID":"e5dab915-9727-452f-a28f-18005580dc90","Type":"ContainerStarted","Data":"adb0e04014fc62e0f3347983e7932f1d562564121579e336143b95127240e1ff"}
Mar 18 13:09:00 crc kubenswrapper[4912]: I0318 13:09:00.576979 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-slcfx" podStartSLOduration=2.067574657 podStartE2EDuration="3.576955464s" podCreationTimestamp="2026-03-18 13:08:57 +0000 UTC" firstStartedPulling="2026-03-18 13:08:58.045927836 +0000 UTC m=+386.505355261" lastFinishedPulling="2026-03-18 13:08:59.555308643 +0000 UTC m=+388.014736068" observedRunningTime="2026-03-18 13:09:00.570631019 +0000 UTC m=+389.030058494" watchObservedRunningTime="2026-03-18 13:09:00.576955464 +0000 UTC m=+389.036382889"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.860380 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"]
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.862409 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.866249 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.866301 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-6b4lw"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.866460 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.871987 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-74chf"]
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.873658 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.875763 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"]
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.876920 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.876943 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-vx85j"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.880237 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.900885 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"]
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.902095 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.905725 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-pffd4"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.906007 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.906194 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.907276 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.927907 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"]
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980479 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980550 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dd03a1c9-7f21-434b-8172-40f4bb719e88-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980665 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gdz6\" (UniqueName: \"kubernetes.io/projected/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-kube-api-access-6gdz6\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980688 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980725 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgv77\" (UniqueName: \"kubernetes.io/projected/f45f3351-f176-4772-ac3a-88b7838f889f-kube-api-access-rgv77\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980745 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980781 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-wtmp\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980812 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980839 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980864 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-root\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980939 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qp7\" (UniqueName: \"kubernetes.io/projected/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-api-access-l4qp7\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.980973 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-textfile\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.981000 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-metrics-client-ca\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.981027 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.981229 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f45f3351-f176-4772-ac3a-88b7838f889f-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.981320 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd03a1c9-7f21-434b-8172-40f4bb719e88-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.981348 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-tls\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:02 crc kubenswrapper[4912]: I0318 13:09:02.981382 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-sys\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082375 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-textfile\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082430 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-metrics-client-ca\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082461 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082488 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f45f3351-f176-4772-ac3a-88b7838f889f-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082517 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd03a1c9-7f21-434b-8172-40f4bb719e88-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082534 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-tls\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082558 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-sys\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082582 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082599 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dd03a1c9-7f21-434b-8172-40f4bb719e88-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082656 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gdz6\" (UniqueName: \"kubernetes.io/projected/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-kube-api-access-6gdz6\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082667 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-sys\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082680 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: E0318 13:09:03.082765 4912 secret.go:188] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082773 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgv77\" (UniqueName: \"kubernetes.io/projected/f45f3351-f176-4772-ac3a-88b7838f889f-kube-api-access-rgv77\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082818 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: E0318 13:09:03.082837 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-tls podName:f45f3351-f176-4772-ac3a-88b7838f889f nodeName:}" failed. No retries permitted until 2026-03-18 13:09:03.582815896 +0000 UTC m=+392.042243321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-lvsbn" (UID: "f45f3351-f176-4772-ac3a-88b7838f889f") : secret "openshift-state-metrics-tls" not found
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082883 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-wtmp\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.082958 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083007 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083066 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-root\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083126 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qp7\" (UniqueName: \"kubernetes.io/projected/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-api-access-l4qp7\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083462 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-metrics-client-ca\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083577 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd03a1c9-7f21-434b-8172-40f4bb719e88-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083607 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083633 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f45f3351-f176-4772-ac3a-88b7838f889f-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083688 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-root\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083749 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-wtmp\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.083825 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dd03a1c9-7f21-434b-8172-40f4bb719e88-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: E0318 13:09:03.083989 4912 secret.go:188] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Mar 18 13:09:03 crc kubenswrapper[4912]: E0318 13:09:03.084154 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-tls podName:077a56fa-6dd9-4761-b8f3-2a13b4d2bc14 nodeName:}" failed. No retries permitted until 2026-03-18 13:09:03.584122362 +0000 UTC m=+392.043549777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-tls") pod "node-exporter-74chf" (UID: "077a56fa-6dd9-4761-b8f3-2a13b4d2bc14") : secret "node-exporter-tls" not found
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.084312 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-textfile\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.092068 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.092725 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.092932 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.099471 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.106552 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gdz6\" (UniqueName: \"kubernetes.io/projected/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-kube-api-access-6gdz6\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.107364 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgv77\" (UniqueName: \"kubernetes.io/projected/f45f3351-f176-4772-ac3a-88b7838f889f-kube-api-access-rgv77\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.109888 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qp7\" (UniqueName: \"kubernetes.io/projected/dd03a1c9-7f21-434b-8172-40f4bb719e88-kube-api-access-l4qp7\") pod \"kube-state-metrics-777cb5bd5d-w8cnz\" (UID: \"dd03a1c9-7f21-434b-8172-40f4bb719e88\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.222170 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.460583 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz"]
Mar 18 13:09:03 crc kubenswrapper[4912]: W0318 13:09:03.466827 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd03a1c9_7f21_434b_8172_40f4bb719e88.slice/crio-425b84b316b4cd486a3db8a569c06460ecaf5062a2dd3c321016dd8ce51c7476 WatchSource:0}: Error finding container 425b84b316b4cd486a3db8a569c06460ecaf5062a2dd3c321016dd8ce51c7476: Status 404 returned error can't find the container with id 425b84b316b4cd486a3db8a569c06460ecaf5062a2dd3c321016dd8ce51c7476
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.594414 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-tls\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.594489 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.596937 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz" event={"ID":"dd03a1c9-7f21-434b-8172-40f4bb719e88","Type":"ContainerStarted","Data":"425b84b316b4cd486a3db8a569c06460ecaf5062a2dd3c321016dd8ce51c7476"}
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.599297 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f45f3351-f176-4772-ac3a-88b7838f889f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-lvsbn\" (UID: \"f45f3351-f176-4772-ac3a-88b7838f889f\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.600083 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/077a56fa-6dd9-4761-b8f3-2a13b4d2bc14-node-exporter-tls\") pod \"node-exporter-74chf\" (UID: \"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14\") " pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.783635 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.802223 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-74chf"
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.967574 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.983076 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.996395 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.996467 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.996563 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.996599 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.996470 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.996844 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-sjsjt" Mar 18 13:09:03 crc kubenswrapper[4912]: I0318 13:09:03.997131 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.007298 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.021861 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.025638 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.107114 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.107203 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6bc\" (UniqueName: \"kubernetes.io/projected/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-kube-api-access-9l6bc\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.107470 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-config-volume\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.107728 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.107805 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.107886 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.108127 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.108158 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.108218 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.108366 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-web-config\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.108426 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-config-out\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.108467 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.209963 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210021 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210111 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210151 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-web-config\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210173 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-config-out\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210198 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210244 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210280 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9l6bc\" (UniqueName: \"kubernetes.io/projected/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-kube-api-access-9l6bc\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210316 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-config-volume\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210352 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210378 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.210409 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.212411 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.213111 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.213184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.217494 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.218395 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.218541 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.218836 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-config-out\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.220760 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-config-volume\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.228243 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.231595 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-web-config\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.234764 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.238193 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6bc\" (UniqueName: \"kubernetes.io/projected/d3883c4a-97dd-4635-b2b2-5a9a8c65c72a-kube-api-access-9l6bc\") pod \"alertmanager-main-0\" (UID: \"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: W0318 13:09:04.275106 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf45f3351_f176_4772_ac3a_88b7838f889f.slice/crio-571386bd57f7f385b2c978bca674bc3476e1881f6b83d4969861996cf42f5c4e WatchSource:0}: Error finding container 571386bd57f7f385b2c978bca674bc3476e1881f6b83d4969861996cf42f5c4e: Status 404 returned error can't find the container with id 571386bd57f7f385b2c978bca674bc3476e1881f6b83d4969861996cf42f5c4e Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.276052 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn"] Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.324599 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.557744 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 13:09:04 crc kubenswrapper[4912]: W0318 13:09:04.569651 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3883c4a_97dd_4635_b2b2_5a9a8c65c72a.slice/crio-d3e186110ad384485fa0dacfd78e0b305ff8cedb8675ac20b76cd42fb0216acd WatchSource:0}: Error finding container d3e186110ad384485fa0dacfd78e0b305ff8cedb8675ac20b76cd42fb0216acd: Status 404 returned error can't find the container with id d3e186110ad384485fa0dacfd78e0b305ff8cedb8675ac20b76cd42fb0216acd Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.606855 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn" event={"ID":"f45f3351-f176-4772-ac3a-88b7838f889f","Type":"ContainerStarted","Data":"2ee76188ce5aefeeb2a33e98ea4e625abd45ef1abd9234098247f22c48c8711b"} Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.606911 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn" event={"ID":"f45f3351-f176-4772-ac3a-88b7838f889f","Type":"ContainerStarted","Data":"bf88df80301db1219cd049aade63cac044af4813d1659b4c956c2e1ae5e00c88"} Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.606930 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn" event={"ID":"f45f3351-f176-4772-ac3a-88b7838f889f","Type":"ContainerStarted","Data":"571386bd57f7f385b2c978bca674bc3476e1881f6b83d4969861996cf42f5c4e"} Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.609111 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerStarted","Data":"d3e186110ad384485fa0dacfd78e0b305ff8cedb8675ac20b76cd42fb0216acd"} Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.610570 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74chf" event={"ID":"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14","Type":"ContainerStarted","Data":"8e79728a82f70ca7e0b56c281508b39fd5882647326049c577fcef0bf35cdb0e"} Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.958117 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-656c486c6f-q6nhd"] Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.961701 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.967006 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.967020 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-97srdfq8kent6" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.967584 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.967633 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-wm9c7" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.967653 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.967598 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 18 13:09:04 
crc kubenswrapper[4912]: I0318 13:09:04.968029 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 18 13:09:04 crc kubenswrapper[4912]: I0318 13:09:04.974196 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-656c486c6f-q6nhd"] Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.130681 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.130743 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.130825 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-grpc-tls\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.130858 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-tls\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.130883 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.130917 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f07400ad-8e47-4209-91f0-dcbdbca254b6-metrics-client-ca\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.130964 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whznq\" (UniqueName: \"kubernetes.io/projected/f07400ad-8e47-4209-91f0-dcbdbca254b6-kube-api-access-whznq\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.131008 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " 
pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232405 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-grpc-tls\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232479 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-tls\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232522 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232568 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f07400ad-8e47-4209-91f0-dcbdbca254b6-metrics-client-ca\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232630 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whznq\" (UniqueName: 
\"kubernetes.io/projected/f07400ad-8e47-4209-91f0-dcbdbca254b6-kube-api-access-whznq\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232682 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232721 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.232756 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.234996 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f07400ad-8e47-4209-91f0-dcbdbca254b6-metrics-client-ca\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " 
pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.240075 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-grpc-tls\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.240448 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.240689 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.243485 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-tls\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.247324 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.247795 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f07400ad-8e47-4209-91f0-dcbdbca254b6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.256896 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whznq\" (UniqueName: \"kubernetes.io/projected/f07400ad-8e47-4209-91f0-dcbdbca254b6-kube-api-access-whznq\") pod \"thanos-querier-656c486c6f-q6nhd\" (UID: \"f07400ad-8e47-4209-91f0-dcbdbca254b6\") " pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.300753 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:05 crc kubenswrapper[4912]: I0318 13:09:05.900755 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-656c486c6f-q6nhd"] Mar 18 13:09:05 crc kubenswrapper[4912]: W0318 13:09:05.924482 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07400ad_8e47_4209_91f0_dcbdbca254b6.slice/crio-3d0718b68ef92b2664ffea689e7ec6769a73257baecf4ffaee1b871d620ace89 WatchSource:0}: Error finding container 3d0718b68ef92b2664ffea689e7ec6769a73257baecf4ffaee1b871d620ace89: Status 404 returned error can't find the container with id 3d0718b68ef92b2664ffea689e7ec6769a73257baecf4ffaee1b871d620ace89 Mar 18 13:09:06 crc kubenswrapper[4912]: I0318 13:09:06.650577 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz" event={"ID":"dd03a1c9-7f21-434b-8172-40f4bb719e88","Type":"ContainerStarted","Data":"d98ba926e93c448ca2bab9986f9b6f927ebc373393469faed7df7266f21475c4"} Mar 18 13:09:06 crc kubenswrapper[4912]: I0318 13:09:06.650970 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz" event={"ID":"dd03a1c9-7f21-434b-8172-40f4bb719e88","Type":"ContainerStarted","Data":"8e944318ccfb65838352b8032f089dae7a7445f9d5553edb167389bcde7be008"} Mar 18 13:09:06 crc kubenswrapper[4912]: I0318 13:09:06.650985 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz" event={"ID":"dd03a1c9-7f21-434b-8172-40f4bb719e88","Type":"ContainerStarted","Data":"d094c898868a09b4de6d8072cd0416d246c5169d55e79aa955544c30dc676596"} Mar 18 13:09:06 crc kubenswrapper[4912]: I0318 13:09:06.652579 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" 
event={"ID":"f07400ad-8e47-4209-91f0-dcbdbca254b6","Type":"ContainerStarted","Data":"3d0718b68ef92b2664ffea689e7ec6769a73257baecf4ffaee1b871d620ace89"} Mar 18 13:09:06 crc kubenswrapper[4912]: I0318 13:09:06.654425 4912 generic.go:334] "Generic (PLEG): container finished" podID="077a56fa-6dd9-4761-b8f3-2a13b4d2bc14" containerID="c98594af6dd6f6638a6f5a380bd20c704ec11cb5204f8c9add5cc935fab4d0b4" exitCode=0 Mar 18 13:09:06 crc kubenswrapper[4912]: I0318 13:09:06.654474 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74chf" event={"ID":"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14","Type":"ContainerDied","Data":"c98594af6dd6f6638a6f5a380bd20c704ec11cb5204f8c9add5cc935fab4d0b4"} Mar 18 13:09:06 crc kubenswrapper[4912]: I0318 13:09:06.681438 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-w8cnz" podStartSLOduration=2.5549935169999998 podStartE2EDuration="4.681415052s" podCreationTimestamp="2026-03-18 13:09:02 +0000 UTC" firstStartedPulling="2026-03-18 13:09:03.470225594 +0000 UTC m=+391.929653019" lastFinishedPulling="2026-03-18 13:09:05.596647109 +0000 UTC m=+394.056074554" observedRunningTime="2026-03-18 13:09:06.678271415 +0000 UTC m=+395.137698880" watchObservedRunningTime="2026-03-18 13:09:06.681415052 +0000 UTC m=+395.140842487" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.715539 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-549c97b65-vch92"] Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.717307 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.739600 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-549c97b65-vch92"] Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.874352 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-service-ca\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.874600 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-trusted-ca-bundle\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.874947 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-oauth-serving-cert\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.875001 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-serving-cert\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.875293 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-console-config\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.875371 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qrk\" (UniqueName: \"kubernetes.io/projected/5db4b715-e205-457b-85fd-4048891c2af6-kube-api-access-d8qrk\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.875421 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-oauth-config\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.976883 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-oauth-serving-cert\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.976952 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-serving-cert\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 
13:09:07.977019 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-console-config\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.977067 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qrk\" (UniqueName: \"kubernetes.io/projected/5db4b715-e205-457b-85fd-4048891c2af6-kube-api-access-d8qrk\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.977095 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-oauth-config\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.977125 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-service-ca\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.977163 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-trusted-ca-bundle\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.978102 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-oauth-serving-cert\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.978125 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-console-config\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.978502 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-trusted-ca-bundle\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.978723 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-service-ca\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.985300 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-serving-cert\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.985333 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-oauth-config\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:07 crc kubenswrapper[4912]: I0318 13:09:07.994829 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qrk\" (UniqueName: \"kubernetes.io/projected/5db4b715-e205-457b-85fd-4048891c2af6-kube-api-access-d8qrk\") pod \"console-549c97b65-vch92\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.035358 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.278426 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-cdbfdff57-fmktk"] Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.280731 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.284715 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.284911 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.285060 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-mqh6f" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.285547 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.286462 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.286512 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3htdb1lp7kj2i" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.290173 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-cdbfdff57-fmktk"] Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.390398 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-secret-metrics-client-certs\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.390451 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.390656 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-audit-log\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.390772 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-secret-metrics-server-tls\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.391529 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc82k\" (UniqueName: \"kubernetes.io/projected/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-kube-api-access-pc82k\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.391573 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-metrics-server-audit-profiles\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " 
pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.391703 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-client-ca-bundle\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.493731 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-audit-log\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.493786 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-secret-metrics-server-tls\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.493809 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc82k\" (UniqueName: \"kubernetes.io/projected/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-kube-api-access-pc82k\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.493833 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-metrics-server-audit-profiles\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.493868 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-client-ca-bundle\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.494055 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-secret-metrics-client-certs\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.494076 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.494505 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-audit-log\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.495432 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.495546 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-metrics-server-audit-profiles\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.500980 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-secret-metrics-client-certs\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.504872 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-client-ca-bundle\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.514834 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc82k\" (UniqueName: \"kubernetes.io/projected/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-kube-api-access-pc82k\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " 
pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.515480 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd-secret-metrics-server-tls\") pod \"metrics-server-cdbfdff57-fmktk\" (UID: \"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd\") " pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.556776 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-549c97b65-vch92"] Mar 18 13:09:08 crc kubenswrapper[4912]: W0318 13:09:08.571314 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db4b715_e205_457b_85fd_4048891c2af6.slice/crio-bcec3eda446345e6650d1231d728faf9fe9e31446398c825b001eb346fb34e63 WatchSource:0}: Error finding container bcec3eda446345e6650d1231d728faf9fe9e31446398c825b001eb346fb34e63: Status 404 returned error can't find the container with id bcec3eda446345e6650d1231d728faf9fe9e31446398c825b001eb346fb34e63 Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.602969 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.642396 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh"] Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.643724 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.646864 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh"] Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.647187 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.647495 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.697090 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn" event={"ID":"f45f3351-f176-4772-ac3a-88b7838f889f","Type":"ContainerStarted","Data":"4b30905e2574d65f2e60f706b90e6762755837aa03de5206ad690a5998ae29c1"} Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.700323 4912 generic.go:334] "Generic (PLEG): container finished" podID="d3883c4a-97dd-4635-b2b2-5a9a8c65c72a" containerID="ad84738a6e0a51b56660eed5d3e44d8b0b08bb6400828f75f73d6cba7d351260" exitCode=0 Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.700407 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerDied","Data":"ad84738a6e0a51b56660eed5d3e44d8b0b08bb6400828f75f73d6cba7d351260"} Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.710635 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74chf" event={"ID":"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14","Type":"ContainerStarted","Data":"31fc424149ce2852b50ca8b7b12d8f5d20e5e01b80992531eb173fe4e809b5f6"} Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.710725 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-74chf" event={"ID":"077a56fa-6dd9-4761-b8f3-2a13b4d2bc14","Type":"ContainerStarted","Data":"377655c977e4b110594b765089a5bd3ece1e11e6f456b0b3fb3d4cb79038e415"} Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.717122 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-lvsbn" podStartSLOduration=3.89507701 podStartE2EDuration="6.71699533s" podCreationTimestamp="2026-03-18 13:09:02 +0000 UTC" firstStartedPulling="2026-03-18 13:09:04.577765556 +0000 UTC m=+393.037192981" lastFinishedPulling="2026-03-18 13:09:07.399683846 +0000 UTC m=+395.859111301" observedRunningTime="2026-03-18 13:09:08.714335927 +0000 UTC m=+397.173763352" watchObservedRunningTime="2026-03-18 13:09:08.71699533 +0000 UTC m=+397.176422755" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.723614 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-549c97b65-vch92" event={"ID":"5db4b715-e205-457b-85fd-4048891c2af6","Type":"ContainerStarted","Data":"bcec3eda446345e6650d1231d728faf9fe9e31446398c825b001eb346fb34e63"} Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.743089 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-74chf" podStartSLOduration=4.984201824 podStartE2EDuration="6.74306529s" podCreationTimestamp="2026-03-18 13:09:02 +0000 UTC" firstStartedPulling="2026-03-18 13:09:03.84131367 +0000 UTC m=+392.300741095" lastFinishedPulling="2026-03-18 13:09:05.600177126 +0000 UTC m=+394.059604561" observedRunningTime="2026-03-18 13:09:08.7412377 +0000 UTC m=+397.200665125" watchObservedRunningTime="2026-03-18 13:09:08.74306529 +0000 UTC m=+397.202492715" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.802883 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/cdcc5deb-7e0f-47e2-be3c-ccf9657de44e-monitoring-plugin-cert\") pod \"monitoring-plugin-7bb9f46ccc-95zvh\" (UID: \"cdcc5deb-7e0f-47e2-be3c-ccf9657de44e\") " pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.901220 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-cdbfdff57-fmktk"] Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.905223 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cdcc5deb-7e0f-47e2-be3c-ccf9657de44e-monitoring-plugin-cert\") pod \"monitoring-plugin-7bb9f46ccc-95zvh\" (UID: \"cdcc5deb-7e0f-47e2-be3c-ccf9657de44e\") " pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.915139 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cdcc5deb-7e0f-47e2-be3c-ccf9657de44e-monitoring-plugin-cert\") pod \"monitoring-plugin-7bb9f46ccc-95zvh\" (UID: \"cdcc5deb-7e0f-47e2-be3c-ccf9657de44e\") " pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 13:09:08 crc kubenswrapper[4912]: I0318 13:09:08.994167 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.316425 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.319098 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.322257 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-mdvp2" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.322592 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.325707 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.325980 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.327669 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.328394 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.328574 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.328732 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-5hb1sbr9be5ra" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.329006 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.329195 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.329525 4912 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.334196 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.342211 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.344572 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413387 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544b60c6-45d2-415c-9145-0dc544d78e4a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413486 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413530 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413583 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413684 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544b60c6-45d2-415c-9145-0dc544d78e4a-config-out\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413715 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4ft5\" (UniqueName: \"kubernetes.io/projected/544b60c6-45d2-415c-9145-0dc544d78e4a-kube-api-access-x4ft5\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413742 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413774 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" 
Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413839 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413878 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413913 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.413944 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.414006 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.414060 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-config\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.414094 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.414121 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-web-config\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.414152 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.414175 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.460581 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh"] Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516438 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516545 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516619 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516651 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-config\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 
13:09:09.516708 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516736 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-web-config\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516791 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516819 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.516902 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544b60c6-45d2-415c-9145-0dc544d78e4a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.517841 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525294 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.523793 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-web-config\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.524690 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544b60c6-45d2-415c-9145-0dc544d78e4a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525003 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525492 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525563 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525597 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525165 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525628 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544b60c6-45d2-415c-9145-0dc544d78e4a-config-out\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525721 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4ft5\" (UniqueName: 
\"kubernetes.io/projected/544b60c6-45d2-415c-9145-0dc544d78e4a-kube-api-access-x4ft5\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525788 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525817 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525836 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525846 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.525924 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.518324 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.526449 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.526810 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.527336 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544b60c6-45d2-415c-9145-0dc544d78e4a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.528847 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.530071 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544b60c6-45d2-415c-9145-0dc544d78e4a-config-out\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.530594 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.538757 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.539073 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.539781 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/544b60c6-45d2-415c-9145-0dc544d78e4a-config\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.545026 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4ft5\" (UniqueName: \"kubernetes.io/projected/544b60c6-45d2-415c-9145-0dc544d78e4a-kube-api-access-x4ft5\") pod \"prometheus-k8s-0\" (UID: \"544b60c6-45d2-415c-9145-0dc544d78e4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.638108 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.734814 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-549c97b65-vch92" event={"ID":"5db4b715-e205-457b-85fd-4048891c2af6","Type":"ContainerStarted","Data":"1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788"} Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.739432 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" event={"ID":"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd","Type":"ContainerStarted","Data":"8a62782693948467f8913b5db0d5e1c021991f6002dd1e925283319203d5c099"} Mar 18 13:09:09 crc kubenswrapper[4912]: I0318 13:09:09.759992 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-549c97b65-vch92" podStartSLOduration=2.759965308 podStartE2EDuration="2.759965308s" podCreationTimestamp="2026-03-18 13:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:09:09.754499007 +0000 UTC m=+398.213926452" watchObservedRunningTime="2026-03-18 13:09:09.759965308 +0000 UTC m=+398.219392733" Mar 18 13:09:10 
crc kubenswrapper[4912]: I0318 13:09:10.505702 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 13:09:10 crc kubenswrapper[4912]: W0318 13:09:10.522371 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod544b60c6_45d2_415c_9145_0dc544d78e4a.slice/crio-17bd00d71ea314d8e027e57c34ace79f8b1e4ebbdc4dfd4e170d4355e3b84e58 WatchSource:0}: Error finding container 17bd00d71ea314d8e027e57c34ace79f8b1e4ebbdc4dfd4e170d4355e3b84e58: Status 404 returned error can't find the container with id 17bd00d71ea314d8e027e57c34ace79f8b1e4ebbdc4dfd4e170d4355e3b84e58 Mar 18 13:09:10 crc kubenswrapper[4912]: I0318 13:09:10.748614 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" event={"ID":"cdcc5deb-7e0f-47e2-be3c-ccf9657de44e","Type":"ContainerStarted","Data":"87d540a0ffdcb85d5c4eac8670db263f06ddce9a5e7d6d08e6799b55a0f4a7f5"} Mar 18 13:09:10 crc kubenswrapper[4912]: I0318 13:09:10.750923 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"00b90b9ef8d9e2b069c758a1d79505a64f2b95988d1ae87a09b59ae8afb90607"} Mar 18 13:09:10 crc kubenswrapper[4912]: I0318 13:09:10.750955 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"17bd00d71ea314d8e027e57c34ace79f8b1e4ebbdc4dfd4e170d4355e3b84e58"} Mar 18 13:09:10 crc kubenswrapper[4912]: I0318 13:09:10.756519 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" event={"ID":"f07400ad-8e47-4209-91f0-dcbdbca254b6","Type":"ContainerStarted","Data":"97851ce85b828855bcc1b67616db3228bfa5e11dab3c3eed0a93f671d3b3e1a8"} Mar 18 13:09:10 crc 
kubenswrapper[4912]: I0318 13:09:10.756568 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" event={"ID":"f07400ad-8e47-4209-91f0-dcbdbca254b6","Type":"ContainerStarted","Data":"87a197eb2e7faf5fe3ad984153e94c6365dded3a69208f5cccff5faf02c96da2"} Mar 18 13:09:10 crc kubenswrapper[4912]: I0318 13:09:10.756581 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" event={"ID":"f07400ad-8e47-4209-91f0-dcbdbca254b6","Type":"ContainerStarted","Data":"5bfd189e127e6a3395397eda9a60eeb76ffaad42016b361534fff1e76c950994"} Mar 18 13:09:11 crc kubenswrapper[4912]: I0318 13:09:11.774946 4912 generic.go:334] "Generic (PLEG): container finished" podID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerID="00b90b9ef8d9e2b069c758a1d79505a64f2b95988d1ae87a09b59ae8afb90607" exitCode=0 Mar 18 13:09:11 crc kubenswrapper[4912]: I0318 13:09:11.774965 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerDied","Data":"00b90b9ef8d9e2b069c758a1d79505a64f2b95988d1ae87a09b59ae8afb90607"} Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.798066 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerStarted","Data":"a08881de9078005c029cf548cdec915dce2feabaf979ec04e56aeccc1b4523f9"} Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.798882 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerStarted","Data":"f592dcc89456ef95be9be6acb4ea741e4c2fc9e9bd333a00f81e7fcf344f6536"} Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.800296 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" event={"ID":"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd","Type":"ContainerStarted","Data":"a8371e44cd45687f253f6ae01e1fd665429fe9f448934a897b3b4aed3a2a1268"} Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.804606 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" event={"ID":"f07400ad-8e47-4209-91f0-dcbdbca254b6","Type":"ContainerStarted","Data":"39f953305a6d2b2c3fc835078a2f901f3413afff46f64d5eef711c8c60dc77f3"} Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.804657 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" event={"ID":"f07400ad-8e47-4209-91f0-dcbdbca254b6","Type":"ContainerStarted","Data":"0cc52f16e54ff83f645a1fc48818a9aca6170d13f774ea85a6755a17b6685531"} Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.805894 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" event={"ID":"cdcc5deb-7e0f-47e2-be3c-ccf9657de44e","Type":"ContainerStarted","Data":"74058773159a1e4ae37dbbf19cbeaeaaf816968f47618babb57c6d207b37783e"} Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.806164 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.811476 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 13:09:13 crc kubenswrapper[4912]: I0318 13:09:13.823623 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" podStartSLOduration=2.060080435 podStartE2EDuration="5.823590445s" podCreationTimestamp="2026-03-18 13:09:08 +0000 UTC" firstStartedPulling="2026-03-18 13:09:08.916711574 +0000 UTC m=+397.376138999" 
lastFinishedPulling="2026-03-18 13:09:12.680221584 +0000 UTC m=+401.139649009" observedRunningTime="2026-03-18 13:09:13.819586075 +0000 UTC m=+402.279013520" watchObservedRunningTime="2026-03-18 13:09:13.823590445 +0000 UTC m=+402.283017890" Mar 18 13:09:14 crc kubenswrapper[4912]: I0318 13:09:14.826772 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerStarted","Data":"205b167f2b87a0dfec47633efa98da1f2f1ae369d2c86190bbc1847e75d7f9df"} Mar 18 13:09:14 crc kubenswrapper[4912]: I0318 13:09:14.832621 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"ece84f78e7438c591755b5ebe1d7a73c1e478df85e5da809f924f49cc4601ac1"} Mar 18 13:09:14 crc kubenswrapper[4912]: I0318 13:09:14.842213 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:14 crc kubenswrapper[4912]: I0318 13:09:14.873260 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" podStartSLOduration=4.254598762 podStartE2EDuration="6.873231379s" podCreationTimestamp="2026-03-18 13:09:08 +0000 UTC" firstStartedPulling="2026-03-18 13:09:10.062173533 +0000 UTC m=+398.521600958" lastFinishedPulling="2026-03-18 13:09:12.68080615 +0000 UTC m=+401.140233575" observedRunningTime="2026-03-18 13:09:13.834275 +0000 UTC m=+402.293702425" watchObservedRunningTime="2026-03-18 13:09:14.873231379 +0000 UTC m=+403.332658814" Mar 18 13:09:14 crc kubenswrapper[4912]: I0318 13:09:14.874232 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" podStartSLOduration=4.067047184 podStartE2EDuration="10.874224926s" podCreationTimestamp="2026-03-18 
13:09:04 +0000 UTC" firstStartedPulling="2026-03-18 13:09:05.927715161 +0000 UTC m=+394.387142586" lastFinishedPulling="2026-03-18 13:09:12.734892903 +0000 UTC m=+401.194320328" observedRunningTime="2026-03-18 13:09:14.872579591 +0000 UTC m=+403.332007036" watchObservedRunningTime="2026-03-18 13:09:14.874224926 +0000 UTC m=+403.333652351" Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.329685 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.853316 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" event={"ID":"f07400ad-8e47-4209-91f0-dcbdbca254b6","Type":"ContainerStarted","Data":"3908eb6cc103da61814a9a095a927e4acee81daf752742dd456d6dbe906a8ec3"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.860058 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerStarted","Data":"813b1a17903fa1a0acd22f06db8a714b898012cbc9a2167eabd964ef65252781"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.860138 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerStarted","Data":"09eb6b5bfed8242cb6888830e16f35a36bbbc30f2a79b5bbef6506082408352e"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.860153 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d3883c4a-97dd-4635-b2b2-5a9a8c65c72a","Type":"ContainerStarted","Data":"2c2118ebe8358bc8c1101bac9a97af84a5ed8ecef3b381f3f9cfba3a735bdba8"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.865925 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"5e9e80862d14d995da414121bd6fecf94e5a604570616b9a32e710bd30694a5d"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.865989 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"cce205c5a2c635223257ca3ca7b48a95c6891df5ad843f145c07de090b8b1c37"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.866006 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"43d5dfa91fcb4b2f5fb169ba1b073be115cb2a5bb5533aadb8c44a9058e4e052"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.866025 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"71b34b50a2520a7189e3c833f42492f247dbb29a43fde37d20328d7f6815e14b"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.866055 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"544b60c6-45d2-415c-9145-0dc544d78e4a","Type":"ContainerStarted","Data":"a0f494b22f2faffcd6ea431a0749400ef5559167302223d6b8192d4369167a92"} Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.915691 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.807484297 podStartE2EDuration="12.915662263s" podCreationTimestamp="2026-03-18 13:09:03 +0000 UTC" firstStartedPulling="2026-03-18 13:09:04.572211932 +0000 UTC m=+393.031639357" lastFinishedPulling="2026-03-18 13:09:12.680389898 +0000 UTC m=+401.139817323" observedRunningTime="2026-03-18 13:09:15.892441912 +0000 UTC m=+404.351869337" watchObservedRunningTime="2026-03-18 
13:09:15.915662263 +0000 UTC m=+404.375089688" Mar 18 13:09:15 crc kubenswrapper[4912]: I0318 13:09:15.958254 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.06406037 podStartE2EDuration="6.958228338s" podCreationTimestamp="2026-03-18 13:09:09 +0000 UTC" firstStartedPulling="2026-03-18 13:09:10.753220245 +0000 UTC m=+399.212647670" lastFinishedPulling="2026-03-18 13:09:14.647388223 +0000 UTC m=+403.106815638" observedRunningTime="2026-03-18 13:09:15.951381219 +0000 UTC m=+404.410808664" watchObservedRunningTime="2026-03-18 13:09:15.958228338 +0000 UTC m=+404.417655763" Mar 18 13:09:18 crc kubenswrapper[4912]: I0318 13:09:18.038340 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:18 crc kubenswrapper[4912]: I0318 13:09:18.040710 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:18 crc kubenswrapper[4912]: I0318 13:09:18.044495 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:18 crc kubenswrapper[4912]: I0318 13:09:18.894101 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:09:18 crc kubenswrapper[4912]: I0318 13:09:18.968851 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vpn9z"] Mar 18 13:09:19 crc kubenswrapper[4912]: I0318 13:09:19.639213 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:09:28 crc kubenswrapper[4912]: I0318 13:09:28.604256 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:28 crc kubenswrapper[4912]: I0318 13:09:28.605101 
4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.026112 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vpn9z" podUID="39c7b2b0-6f20-426b-961d-65878696145f" containerName="console" containerID="cri-o://f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2" gracePeriod=15 Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.416695 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vpn9z_39c7b2b0-6f20-426b-961d-65878696145f/console/0.log" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.417299 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.557854 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-service-ca\") pod \"39c7b2b0-6f20-426b-961d-65878696145f\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.557944 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-oauth-config\") pod \"39c7b2b0-6f20-426b-961d-65878696145f\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.558018 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-oauth-serving-cert\") pod \"39c7b2b0-6f20-426b-961d-65878696145f\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " Mar 18 13:09:44 crc 
kubenswrapper[4912]: I0318 13:09:44.558079 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5ktm\" (UniqueName: \"kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm\") pod \"39c7b2b0-6f20-426b-961d-65878696145f\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.558130 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-console-config\") pod \"39c7b2b0-6f20-426b-961d-65878696145f\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.558296 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-serving-cert\") pod \"39c7b2b0-6f20-426b-961d-65878696145f\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.558369 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-trusted-ca-bundle\") pod \"39c7b2b0-6f20-426b-961d-65878696145f\" (UID: \"39c7b2b0-6f20-426b-961d-65878696145f\") " Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.558848 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-service-ca" (OuterVolumeSpecName: "service-ca") pod "39c7b2b0-6f20-426b-961d-65878696145f" (UID: "39c7b2b0-6f20-426b-961d-65878696145f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.558864 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-console-config" (OuterVolumeSpecName: "console-config") pod "39c7b2b0-6f20-426b-961d-65878696145f" (UID: "39c7b2b0-6f20-426b-961d-65878696145f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.559508 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "39c7b2b0-6f20-426b-961d-65878696145f" (UID: "39c7b2b0-6f20-426b-961d-65878696145f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.559547 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "39c7b2b0-6f20-426b-961d-65878696145f" (UID: "39c7b2b0-6f20-426b-961d-65878696145f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.565066 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "39c7b2b0-6f20-426b-961d-65878696145f" (UID: "39c7b2b0-6f20-426b-961d-65878696145f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.565147 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm" (OuterVolumeSpecName: "kube-api-access-r5ktm") pod "39c7b2b0-6f20-426b-961d-65878696145f" (UID: "39c7b2b0-6f20-426b-961d-65878696145f"). InnerVolumeSpecName "kube-api-access-r5ktm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.578345 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "39c7b2b0-6f20-426b-961d-65878696145f" (UID: "39c7b2b0-6f20-426b-961d-65878696145f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.660407 4912 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.660483 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.660508 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.660527 4912 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/39c7b2b0-6f20-426b-961d-65878696145f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.660550 4912 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.660569 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5ktm\" (UniqueName: \"kubernetes.io/projected/39c7b2b0-6f20-426b-961d-65878696145f-kube-api-access-r5ktm\") on node \"crc\" DevicePath \"\"" Mar 18 13:09:44 crc kubenswrapper[4912]: I0318 13:09:44.660590 4912 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39c7b2b0-6f20-426b-961d-65878696145f-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.074489 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vpn9z_39c7b2b0-6f20-426b-961d-65878696145f/console/0.log" Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.074553 4912 generic.go:334] "Generic (PLEG): container finished" podID="39c7b2b0-6f20-426b-961d-65878696145f" containerID="f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2" exitCode=2 Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.074627 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vpn9z" Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.074613 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpn9z" event={"ID":"39c7b2b0-6f20-426b-961d-65878696145f","Type":"ContainerDied","Data":"f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2"} Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.074799 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpn9z" event={"ID":"39c7b2b0-6f20-426b-961d-65878696145f","Type":"ContainerDied","Data":"bbdb52150e48248d30a986b090e5eb1b861fb63ca9ce5cc38829b9e1b40b34dd"} Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.074830 4912 scope.go:117] "RemoveContainer" containerID="f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2" Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.112890 4912 scope.go:117] "RemoveContainer" containerID="f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2" Mar 18 13:09:45 crc kubenswrapper[4912]: E0318 13:09:45.114487 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2\": container with ID starting with f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2 not found: ID does not exist" containerID="f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2" Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.114643 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2"} err="failed to get container status \"f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2\": rpc error: code = NotFound desc = could not find container \"f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2\": 
container with ID starting with f8247588dff98e419ec7159560bf5228f5dbeb9825776cb4fd3fbf3733681ea2 not found: ID does not exist" Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.120230 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vpn9z"] Mar 18 13:09:45 crc kubenswrapper[4912]: I0318 13:09:45.123900 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vpn9z"] Mar 18 13:09:46 crc kubenswrapper[4912]: I0318 13:09:46.236879 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c7b2b0-6f20-426b-961d-65878696145f" path="/var/lib/kubelet/pods/39c7b2b0-6f20-426b-961d-65878696145f/volumes" Mar 18 13:09:48 crc kubenswrapper[4912]: I0318 13:09:48.613587 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:09:48 crc kubenswrapper[4912]: I0318 13:09:48.622973 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.135246 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563990-k6q9w"] Mar 18 13:10:00 crc kubenswrapper[4912]: E0318 13:10:00.136240 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c7b2b0-6f20-426b-961d-65878696145f" containerName="console" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.136259 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7b2b0-6f20-426b-961d-65878696145f" containerName="console" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.136454 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c7b2b0-6f20-426b-961d-65878696145f" containerName="console" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.137103 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-k6q9w" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.139811 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.144510 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.144532 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.155306 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-k6q9w"] Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.237426 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgd7\" (UniqueName: \"kubernetes.io/projected/a7074714-b00d-490c-83de-bceb48442c19-kube-api-access-qvgd7\") pod \"auto-csr-approver-29563990-k6q9w\" (UID: \"a7074714-b00d-490c-83de-bceb48442c19\") " pod="openshift-infra/auto-csr-approver-29563990-k6q9w" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.339255 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgd7\" (UniqueName: \"kubernetes.io/projected/a7074714-b00d-490c-83de-bceb48442c19-kube-api-access-qvgd7\") pod \"auto-csr-approver-29563990-k6q9w\" (UID: \"a7074714-b00d-490c-83de-bceb48442c19\") " pod="openshift-infra/auto-csr-approver-29563990-k6q9w" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.359576 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvgd7\" (UniqueName: \"kubernetes.io/projected/a7074714-b00d-490c-83de-bceb48442c19-kube-api-access-qvgd7\") pod \"auto-csr-approver-29563990-k6q9w\" (UID: \"a7074714-b00d-490c-83de-bceb48442c19\") " 
pod="openshift-infra/auto-csr-approver-29563990-k6q9w" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.460671 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-k6q9w" Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.696768 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-k6q9w"] Mar 18 13:10:00 crc kubenswrapper[4912]: W0318 13:10:00.704517 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7074714_b00d_490c_83de_bceb48442c19.slice/crio-8d4daa065459339f9ca0bc2e07999c0fc87c1c883410cf33f68745eb9cf88990 WatchSource:0}: Error finding container 8d4daa065459339f9ca0bc2e07999c0fc87c1c883410cf33f68745eb9cf88990: Status 404 returned error can't find the container with id 8d4daa065459339f9ca0bc2e07999c0fc87c1c883410cf33f68745eb9cf88990 Mar 18 13:10:00 crc kubenswrapper[4912]: I0318 13:10:00.708491 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:10:01 crc kubenswrapper[4912]: I0318 13:10:01.191753 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-k6q9w" event={"ID":"a7074714-b00d-490c-83de-bceb48442c19","Type":"ContainerStarted","Data":"8d4daa065459339f9ca0bc2e07999c0fc87c1c883410cf33f68745eb9cf88990"} Mar 18 13:10:03 crc kubenswrapper[4912]: I0318 13:10:03.208400 4912 generic.go:334] "Generic (PLEG): container finished" podID="a7074714-b00d-490c-83de-bceb48442c19" containerID="096d815c6e88750ce17f4f80b15927a1dbe11bb161a0ebc48319486a8a784274" exitCode=0 Mar 18 13:10:03 crc kubenswrapper[4912]: I0318 13:10:03.208758 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-k6q9w" 
event={"ID":"a7074714-b00d-490c-83de-bceb48442c19","Type":"ContainerDied","Data":"096d815c6e88750ce17f4f80b15927a1dbe11bb161a0ebc48319486a8a784274"} Mar 18 13:10:04 crc kubenswrapper[4912]: I0318 13:10:04.474621 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-k6q9w" Mar 18 13:10:04 crc kubenswrapper[4912]: I0318 13:10:04.528941 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvgd7\" (UniqueName: \"kubernetes.io/projected/a7074714-b00d-490c-83de-bceb48442c19-kube-api-access-qvgd7\") pod \"a7074714-b00d-490c-83de-bceb48442c19\" (UID: \"a7074714-b00d-490c-83de-bceb48442c19\") " Mar 18 13:10:04 crc kubenswrapper[4912]: I0318 13:10:04.543062 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7074714-b00d-490c-83de-bceb48442c19-kube-api-access-qvgd7" (OuterVolumeSpecName: "kube-api-access-qvgd7") pod "a7074714-b00d-490c-83de-bceb48442c19" (UID: "a7074714-b00d-490c-83de-bceb48442c19"). InnerVolumeSpecName "kube-api-access-qvgd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:10:04 crc kubenswrapper[4912]: I0318 13:10:04.632133 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvgd7\" (UniqueName: \"kubernetes.io/projected/a7074714-b00d-490c-83de-bceb48442c19-kube-api-access-qvgd7\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:05 crc kubenswrapper[4912]: I0318 13:10:05.224606 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-k6q9w" event={"ID":"a7074714-b00d-490c-83de-bceb48442c19","Type":"ContainerDied","Data":"8d4daa065459339f9ca0bc2e07999c0fc87c1c883410cf33f68745eb9cf88990"} Mar 18 13:10:05 crc kubenswrapper[4912]: I0318 13:10:05.224671 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d4daa065459339f9ca0bc2e07999c0fc87c1c883410cf33f68745eb9cf88990" Mar 18 13:10:05 crc kubenswrapper[4912]: I0318 13:10:05.224687 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-k6q9w" Mar 18 13:10:06 crc kubenswrapper[4912]: I0318 13:10:06.999391 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:10:07 crc kubenswrapper[4912]: I0318 13:10:06.999966 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:10:09 crc kubenswrapper[4912]: I0318 13:10:09.638305 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:10:09 crc kubenswrapper[4912]: I0318 13:10:09.680150 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:10:10 crc kubenswrapper[4912]: I0318 13:10:10.306024 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.787608 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-647cc7864c-595s8"] Mar 18 13:10:34 crc kubenswrapper[4912]: E0318 13:10:34.788650 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7074714-b00d-490c-83de-bceb48442c19" containerName="oc" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.788667 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7074714-b00d-490c-83de-bceb48442c19" containerName="oc" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.788803 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7074714-b00d-490c-83de-bceb48442c19" containerName="oc" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.789723 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.807914 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-647cc7864c-595s8"] Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.892300 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-config\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.892387 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-oauth-config\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.892497 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfxr8\" (UniqueName: \"kubernetes.io/projected/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-kube-api-access-qfxr8\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.892539 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-trusted-ca-bundle\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.892580 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-service-ca\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.892618 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-serving-cert\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.892676 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-oauth-serving-cert\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.994380 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfxr8\" (UniqueName: \"kubernetes.io/projected/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-kube-api-access-qfxr8\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.994448 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-trusted-ca-bundle\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.994495 
4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-service-ca\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.994536 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-serving-cert\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.994589 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-oauth-serving-cert\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.994640 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-config\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.994665 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-oauth-config\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.995923 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-service-ca\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.996015 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-oauth-serving-cert\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.996107 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-trusted-ca-bundle\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:34 crc kubenswrapper[4912]: I0318 13:10:34.996246 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-config\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:35 crc kubenswrapper[4912]: I0318 13:10:35.002911 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-oauth-config\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:35 crc kubenswrapper[4912]: I0318 13:10:35.003654 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-serving-cert\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:35 crc kubenswrapper[4912]: I0318 13:10:35.017139 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfxr8\" (UniqueName: \"kubernetes.io/projected/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-kube-api-access-qfxr8\") pod \"console-647cc7864c-595s8\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:35 crc kubenswrapper[4912]: I0318 13:10:35.106662 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:35 crc kubenswrapper[4912]: I0318 13:10:35.343970 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-647cc7864c-595s8"] Mar 18 13:10:36 crc kubenswrapper[4912]: I0318 13:10:36.122957 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-647cc7864c-595s8" event={"ID":"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6","Type":"ContainerStarted","Data":"351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451"} Mar 18 13:10:36 crc kubenswrapper[4912]: I0318 13:10:36.123081 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-647cc7864c-595s8" event={"ID":"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6","Type":"ContainerStarted","Data":"867e4f1aa443970b3cf77bf62f5f119404bf1af5fcab9a79e0adfb15e74e6348"} Mar 18 13:10:36 crc kubenswrapper[4912]: I0318 13:10:36.149400 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-647cc7864c-595s8" podStartSLOduration=2.149368109 podStartE2EDuration="2.149368109s" podCreationTimestamp="2026-03-18 13:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:10:36.140143881 +0000 UTC m=+484.599571326" watchObservedRunningTime="2026-03-18 13:10:36.149368109 +0000 UTC m=+484.608795544" Mar 18 13:10:36 crc kubenswrapper[4912]: I0318 13:10:36.999112 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:10:36 crc kubenswrapper[4912]: I0318 13:10:36.999217 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:10:45 crc kubenswrapper[4912]: I0318 13:10:45.107426 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:45 crc kubenswrapper[4912]: I0318 13:10:45.108483 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:45 crc kubenswrapper[4912]: I0318 13:10:45.112219 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:45 crc kubenswrapper[4912]: I0318 13:10:45.180835 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:10:45 crc kubenswrapper[4912]: I0318 13:10:45.228968 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-549c97b65-vch92"] Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:06.999180 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:07.000083 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:07.000197 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:07.000954 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2eae871d72b2861d79999e929dd74343dc3d0189bcb36cebcb51e26fbc951242"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:07.001013 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://2eae871d72b2861d79999e929dd74343dc3d0189bcb36cebcb51e26fbc951242" gracePeriod=600 Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:07.330505 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="2eae871d72b2861d79999e929dd74343dc3d0189bcb36cebcb51e26fbc951242" exitCode=0 Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:07.330602 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"2eae871d72b2861d79999e929dd74343dc3d0189bcb36cebcb51e26fbc951242"} Mar 18 13:11:07 crc kubenswrapper[4912]: I0318 13:11:07.331126 4912 scope.go:117] "RemoveContainer" containerID="acda12973c95ab76d196e7a3ee0eb3d698f14dd5571fede1cd9a2b7308ff3614" Mar 18 13:11:08 crc kubenswrapper[4912]: I0318 13:11:08.342028 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"49978836919fd8cfdafc75ececc71c0f4203bbe58b30f6727a2f99ecffd2ea26"} Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.283870 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-549c97b65-vch92" podUID="5db4b715-e205-457b-85fd-4048891c2af6" containerName="console" containerID="cri-o://1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788" gracePeriod=15 Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.622358 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-549c97b65-vch92_5db4b715-e205-457b-85fd-4048891c2af6/console/0.log" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.622965 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.692337 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-console-config\") pod \"5db4b715-e205-457b-85fd-4048891c2af6\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.692598 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8qrk\" (UniqueName: \"kubernetes.io/projected/5db4b715-e205-457b-85fd-4048891c2af6-kube-api-access-d8qrk\") pod \"5db4b715-e205-457b-85fd-4048891c2af6\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.692742 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-serving-cert\") pod \"5db4b715-e205-457b-85fd-4048891c2af6\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.692828 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-oauth-serving-cert\") pod \"5db4b715-e205-457b-85fd-4048891c2af6\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.692973 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-trusted-ca-bundle\") pod \"5db4b715-e205-457b-85fd-4048891c2af6\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.693124 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-service-ca\") pod \"5db4b715-e205-457b-85fd-4048891c2af6\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.693166 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-oauth-config\") pod \"5db4b715-e205-457b-85fd-4048891c2af6\" (UID: \"5db4b715-e205-457b-85fd-4048891c2af6\") " Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.693456 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5db4b715-e205-457b-85fd-4048891c2af6" (UID: "5db4b715-e205-457b-85fd-4048891c2af6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.693490 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-console-config" (OuterVolumeSpecName: "console-config") pod "5db4b715-e205-457b-85fd-4048891c2af6" (UID: "5db4b715-e205-457b-85fd-4048891c2af6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.693516 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5db4b715-e205-457b-85fd-4048891c2af6" (UID: "5db4b715-e205-457b-85fd-4048891c2af6"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.693649 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-service-ca" (OuterVolumeSpecName: "service-ca") pod "5db4b715-e205-457b-85fd-4048891c2af6" (UID: "5db4b715-e205-457b-85fd-4048891c2af6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.694526 4912 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.694550 4912 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.694561 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.694571 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db4b715-e205-457b-85fd-4048891c2af6-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.698945 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5db4b715-e205-457b-85fd-4048891c2af6" (UID: "5db4b715-e205-457b-85fd-4048891c2af6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.699625 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5db4b715-e205-457b-85fd-4048891c2af6" (UID: "5db4b715-e205-457b-85fd-4048891c2af6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.700299 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db4b715-e205-457b-85fd-4048891c2af6-kube-api-access-d8qrk" (OuterVolumeSpecName: "kube-api-access-d8qrk") pod "5db4b715-e205-457b-85fd-4048891c2af6" (UID: "5db4b715-e205-457b-85fd-4048891c2af6"). InnerVolumeSpecName "kube-api-access-d8qrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.794980 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8qrk\" (UniqueName: \"kubernetes.io/projected/5db4b715-e205-457b-85fd-4048891c2af6-kube-api-access-d8qrk\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.795029 4912 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:10 crc kubenswrapper[4912]: I0318 13:11:10.795051 4912 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db4b715-e205-457b-85fd-4048891c2af6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.362571 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-549c97b65-vch92_5db4b715-e205-457b-85fd-4048891c2af6/console/0.log" Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.362646 4912 generic.go:334] "Generic (PLEG): container finished" podID="5db4b715-e205-457b-85fd-4048891c2af6" containerID="1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788" exitCode=2 Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.362699 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-549c97b65-vch92" event={"ID":"5db4b715-e205-457b-85fd-4048891c2af6","Type":"ContainerDied","Data":"1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788"} Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.362730 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-549c97b65-vch92" Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.362770 4912 scope.go:117] "RemoveContainer" containerID="1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788" Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.362750 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-549c97b65-vch92" event={"ID":"5db4b715-e205-457b-85fd-4048891c2af6","Type":"ContainerDied","Data":"bcec3eda446345e6650d1231d728faf9fe9e31446398c825b001eb346fb34e63"} Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.390862 4912 scope.go:117] "RemoveContainer" containerID="1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788" Mar 18 13:11:11 crc kubenswrapper[4912]: E0318 13:11:11.392430 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788\": container with ID starting with 1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788 not found: ID does not exist" 
containerID="1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788" Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.392486 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788"} err="failed to get container status \"1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788\": rpc error: code = NotFound desc = could not find container \"1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788\": container with ID starting with 1759fa7369eb991659b390f06b1b444197225087d557d4fadb57099b3fb8f788 not found: ID does not exist" Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.399471 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-549c97b65-vch92"] Mar 18 13:11:11 crc kubenswrapper[4912]: I0318 13:11:11.403716 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-549c97b65-vch92"] Mar 18 13:11:12 crc kubenswrapper[4912]: I0318 13:11:12.235939 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db4b715-e205-457b-85fd-4048891c2af6" path="/var/lib/kubelet/pods/5db4b715-e205-457b-85fd-4048891c2af6/volumes" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.153612 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563992-b5clf"] Mar 18 13:12:00 crc kubenswrapper[4912]: E0318 13:12:00.154946 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db4b715-e205-457b-85fd-4048891c2af6" containerName="console" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.154975 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db4b715-e205-457b-85fd-4048891c2af6" containerName="console" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.155192 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db4b715-e205-457b-85fd-4048891c2af6" 
containerName="console" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.155948 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-b5clf" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.160933 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.161414 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.162095 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.171972 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-b5clf"] Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.321124 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t44h9\" (UniqueName: \"kubernetes.io/projected/5f8c4d7d-23d6-4ee7-afe7-86295ac42fea-kube-api-access-t44h9\") pod \"auto-csr-approver-29563992-b5clf\" (UID: \"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea\") " pod="openshift-infra/auto-csr-approver-29563992-b5clf" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.425217 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t44h9\" (UniqueName: \"kubernetes.io/projected/5f8c4d7d-23d6-4ee7-afe7-86295ac42fea-kube-api-access-t44h9\") pod \"auto-csr-approver-29563992-b5clf\" (UID: \"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea\") " pod="openshift-infra/auto-csr-approver-29563992-b5clf" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.456266 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t44h9\" (UniqueName: 
\"kubernetes.io/projected/5f8c4d7d-23d6-4ee7-afe7-86295ac42fea-kube-api-access-t44h9\") pod \"auto-csr-approver-29563992-b5clf\" (UID: \"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea\") " pod="openshift-infra/auto-csr-approver-29563992-b5clf" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.488828 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-b5clf" Mar 18 13:12:00 crc kubenswrapper[4912]: I0318 13:12:00.731642 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-b5clf"] Mar 18 13:12:01 crc kubenswrapper[4912]: I0318 13:12:01.731669 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-b5clf" event={"ID":"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea","Type":"ContainerStarted","Data":"0c101e217bc45f02e1976dac99ce9d8473515f9ce58eb095424f5a10e9352729"} Mar 18 13:12:02 crc kubenswrapper[4912]: I0318 13:12:02.742305 4912 generic.go:334] "Generic (PLEG): container finished" podID="5f8c4d7d-23d6-4ee7-afe7-86295ac42fea" containerID="d0ee852435c27a5a41a5a94aa89dc09cb28a0c542d8334c39a095800bf8c74e7" exitCode=0 Mar 18 13:12:02 crc kubenswrapper[4912]: I0318 13:12:02.742419 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-b5clf" event={"ID":"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea","Type":"ContainerDied","Data":"d0ee852435c27a5a41a5a94aa89dc09cb28a0c542d8334c39a095800bf8c74e7"} Mar 18 13:12:04 crc kubenswrapper[4912]: I0318 13:12:04.010577 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-b5clf" Mar 18 13:12:04 crc kubenswrapper[4912]: I0318 13:12:04.193484 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t44h9\" (UniqueName: \"kubernetes.io/projected/5f8c4d7d-23d6-4ee7-afe7-86295ac42fea-kube-api-access-t44h9\") pod \"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea\" (UID: \"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea\") " Mar 18 13:12:04 crc kubenswrapper[4912]: I0318 13:12:04.202387 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8c4d7d-23d6-4ee7-afe7-86295ac42fea-kube-api-access-t44h9" (OuterVolumeSpecName: "kube-api-access-t44h9") pod "5f8c4d7d-23d6-4ee7-afe7-86295ac42fea" (UID: "5f8c4d7d-23d6-4ee7-afe7-86295ac42fea"). InnerVolumeSpecName "kube-api-access-t44h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:12:04 crc kubenswrapper[4912]: I0318 13:12:04.295964 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t44h9\" (UniqueName: \"kubernetes.io/projected/5f8c4d7d-23d6-4ee7-afe7-86295ac42fea-kube-api-access-t44h9\") on node \"crc\" DevicePath \"\"" Mar 18 13:12:04 crc kubenswrapper[4912]: I0318 13:12:04.759163 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-b5clf" event={"ID":"5f8c4d7d-23d6-4ee7-afe7-86295ac42fea","Type":"ContainerDied","Data":"0c101e217bc45f02e1976dac99ce9d8473515f9ce58eb095424f5a10e9352729"} Mar 18 13:12:04 crc kubenswrapper[4912]: I0318 13:12:04.759742 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c101e217bc45f02e1976dac99ce9d8473515f9ce58eb095424f5a10e9352729" Mar 18 13:12:04 crc kubenswrapper[4912]: I0318 13:12:04.759333 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-b5clf" Mar 18 13:12:05 crc kubenswrapper[4912]: I0318 13:12:05.106638 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-rl8cg"] Mar 18 13:12:05 crc kubenswrapper[4912]: I0318 13:12:05.118806 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-rl8cg"] Mar 18 13:12:06 crc kubenswrapper[4912]: I0318 13:12:06.239810 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2cbf235-6dbe-4747-b167-89f2593c2ee9" path="/var/lib/kubelet/pods/b2cbf235-6dbe-4747-b167-89f2593c2ee9/volumes" Mar 18 13:12:32 crc kubenswrapper[4912]: I0318 13:12:32.581452 4912 scope.go:117] "RemoveContainer" containerID="56df29823074927f3beabd9ee4c3df2aa40497f476a8b01aa79403e9aacb998e" Mar 18 13:13:37 crc kubenswrapper[4912]: I0318 13:13:36.999937 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:13:37 crc kubenswrapper[4912]: I0318 13:13:37.000867 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.142183 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563994-8tp7p"] Mar 18 13:14:00 crc kubenswrapper[4912]: E0318 13:14:00.144887 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8c4d7d-23d6-4ee7-afe7-86295ac42fea" containerName="oc" Mar 18 13:14:00 crc 
kubenswrapper[4912]: I0318 13:14:00.144905 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8c4d7d-23d6-4ee7-afe7-86295ac42fea" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.145051 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8c4d7d-23d6-4ee7-afe7-86295ac42fea" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.145543 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.147857 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-8tp7p"] Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.147909 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.148068 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.148095 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.222551 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4cgd\" (UniqueName: \"kubernetes.io/projected/17cc9abc-8326-4767-ba0f-6efee8163b1f-kube-api-access-p4cgd\") pod \"auto-csr-approver-29563994-8tp7p\" (UID: \"17cc9abc-8326-4767-ba0f-6efee8163b1f\") " pod="openshift-infra/auto-csr-approver-29563994-8tp7p" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.324598 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4cgd\" (UniqueName: \"kubernetes.io/projected/17cc9abc-8326-4767-ba0f-6efee8163b1f-kube-api-access-p4cgd\") pod \"auto-csr-approver-29563994-8tp7p\" 
(UID: \"17cc9abc-8326-4767-ba0f-6efee8163b1f\") " pod="openshift-infra/auto-csr-approver-29563994-8tp7p" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.346295 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4cgd\" (UniqueName: \"kubernetes.io/projected/17cc9abc-8326-4767-ba0f-6efee8163b1f-kube-api-access-p4cgd\") pod \"auto-csr-approver-29563994-8tp7p\" (UID: \"17cc9abc-8326-4767-ba0f-6efee8163b1f\") " pod="openshift-infra/auto-csr-approver-29563994-8tp7p" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.471446 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" Mar 18 13:14:00 crc kubenswrapper[4912]: I0318 13:14:00.909135 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-8tp7p"] Mar 18 13:14:00 crc kubenswrapper[4912]: W0318 13:14:00.920457 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17cc9abc_8326_4767_ba0f_6efee8163b1f.slice/crio-01f372971b056ee631641aebead3945a4647e9c8a9bad8446c1c1ce87fd8cfdc WatchSource:0}: Error finding container 01f372971b056ee631641aebead3945a4647e9c8a9bad8446c1c1ce87fd8cfdc: Status 404 returned error can't find the container with id 01f372971b056ee631641aebead3945a4647e9c8a9bad8446c1c1ce87fd8cfdc Mar 18 13:14:01 crc kubenswrapper[4912]: I0318 13:14:01.592770 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" event={"ID":"17cc9abc-8326-4767-ba0f-6efee8163b1f","Type":"ContainerStarted","Data":"01f372971b056ee631641aebead3945a4647e9c8a9bad8446c1c1ce87fd8cfdc"} Mar 18 13:14:02 crc kubenswrapper[4912]: I0318 13:14:02.600737 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" 
event={"ID":"17cc9abc-8326-4767-ba0f-6efee8163b1f","Type":"ContainerStarted","Data":"672fdc1df7dbc5fc0bdab535714f3ef79f01dbb702b11cd691edee75dcabc545"} Mar 18 13:14:02 crc kubenswrapper[4912]: I0318 13:14:02.634251 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" podStartSLOduration=1.41264802 podStartE2EDuration="2.634218822s" podCreationTimestamp="2026-03-18 13:14:00 +0000 UTC" firstStartedPulling="2026-03-18 13:14:00.922982721 +0000 UTC m=+689.382410146" lastFinishedPulling="2026-03-18 13:14:02.144553493 +0000 UTC m=+690.603980948" observedRunningTime="2026-03-18 13:14:02.621226632 +0000 UTC m=+691.080654067" watchObservedRunningTime="2026-03-18 13:14:02.634218822 +0000 UTC m=+691.093646257" Mar 18 13:14:03 crc kubenswrapper[4912]: I0318 13:14:03.609696 4912 generic.go:334] "Generic (PLEG): container finished" podID="17cc9abc-8326-4767-ba0f-6efee8163b1f" containerID="672fdc1df7dbc5fc0bdab535714f3ef79f01dbb702b11cd691edee75dcabc545" exitCode=0 Mar 18 13:14:03 crc kubenswrapper[4912]: I0318 13:14:03.609744 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" event={"ID":"17cc9abc-8326-4767-ba0f-6efee8163b1f","Type":"ContainerDied","Data":"672fdc1df7dbc5fc0bdab535714f3ef79f01dbb702b11cd691edee75dcabc545"} Mar 18 13:14:04 crc kubenswrapper[4912]: I0318 13:14:04.895272 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.005672 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4cgd\" (UniqueName: \"kubernetes.io/projected/17cc9abc-8326-4767-ba0f-6efee8163b1f-kube-api-access-p4cgd\") pod \"17cc9abc-8326-4767-ba0f-6efee8163b1f\" (UID: \"17cc9abc-8326-4767-ba0f-6efee8163b1f\") " Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.012448 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cc9abc-8326-4767-ba0f-6efee8163b1f-kube-api-access-p4cgd" (OuterVolumeSpecName: "kube-api-access-p4cgd") pod "17cc9abc-8326-4767-ba0f-6efee8163b1f" (UID: "17cc9abc-8326-4767-ba0f-6efee8163b1f"). InnerVolumeSpecName "kube-api-access-p4cgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.107873 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4cgd\" (UniqueName: \"kubernetes.io/projected/17cc9abc-8326-4767-ba0f-6efee8163b1f-kube-api-access-p4cgd\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.318506 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-s78jq"] Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.323476 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-s78jq"] Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.624825 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" event={"ID":"17cc9abc-8326-4767-ba0f-6efee8163b1f","Type":"ContainerDied","Data":"01f372971b056ee631641aebead3945a4647e9c8a9bad8446c1c1ce87fd8cfdc"} Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.624881 4912 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="01f372971b056ee631641aebead3945a4647e9c8a9bad8446c1c1ce87fd8cfdc" Mar 18 13:14:05 crc kubenswrapper[4912]: I0318 13:14:05.624900 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-8tp7p" Mar 18 13:14:06 crc kubenswrapper[4912]: I0318 13:14:06.238242 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8efe2a7-7ebf-472a-8c44-3e2f15209acf" path="/var/lib/kubelet/pods/e8efe2a7-7ebf-472a-8c44-3e2f15209acf/volumes" Mar 18 13:14:06 crc kubenswrapper[4912]: I0318 13:14:06.998493 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:14:06 crc kubenswrapper[4912]: I0318 13:14:06.998576 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:14:32 crc kubenswrapper[4912]: I0318 13:14:32.661496 4912 scope.go:117] "RemoveContainer" containerID="a4feec8dbebc8bdb216579be927254a771ac0d5d393536fb178e4546ceef6aab" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.046500 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g"] Mar 18 13:14:36 crc kubenswrapper[4912]: E0318 13:14:36.047730 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cc9abc-8326-4767-ba0f-6efee8163b1f" containerName="oc" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.047754 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="17cc9abc-8326-4767-ba0f-6efee8163b1f" containerName="oc" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.047962 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cc9abc-8326-4767-ba0f-6efee8163b1f" containerName="oc" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.049067 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.052148 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.071895 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g"] Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.144028 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vljh\" (UniqueName: \"kubernetes.io/projected/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-kube-api-access-4vljh\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.144104 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.144177 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.245901 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.246014 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vljh\" (UniqueName: \"kubernetes.io/projected/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-kube-api-access-4vljh\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.246057 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.246556 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-bundle\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.246623 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.269433 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vljh\" (UniqueName: \"kubernetes.io/projected/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-kube-api-access-4vljh\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.364776 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.611239 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g"] Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.850279 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" event={"ID":"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb","Type":"ContainerStarted","Data":"69a6740a18a441d4ee8ee62a27bf3775dca4bd2d3948465917d914ec58811f54"} Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.850326 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" event={"ID":"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb","Type":"ContainerStarted","Data":"699088ef230cc7b1566bf2422ba8fa22730422f0fe489c51570536ca5d5d7031"} Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.998348 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.998447 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.998529 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.999301 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49978836919fd8cfdafc75ececc71c0f4203bbe58b30f6727a2f99ecffd2ea26"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:14:36 crc kubenswrapper[4912]: I0318 13:14:36.999375 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://49978836919fd8cfdafc75ececc71c0f4203bbe58b30f6727a2f99ecffd2ea26" gracePeriod=600 Mar 18 13:14:37 crc kubenswrapper[4912]: I0318 13:14:37.859645 4912 generic.go:334] "Generic (PLEG): container finished" podID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerID="69a6740a18a441d4ee8ee62a27bf3775dca4bd2d3948465917d914ec58811f54" exitCode=0 Mar 18 13:14:37 crc kubenswrapper[4912]: I0318 13:14:37.859797 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" event={"ID":"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb","Type":"ContainerDied","Data":"69a6740a18a441d4ee8ee62a27bf3775dca4bd2d3948465917d914ec58811f54"} Mar 18 13:14:37 crc kubenswrapper[4912]: I0318 13:14:37.863556 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="49978836919fd8cfdafc75ececc71c0f4203bbe58b30f6727a2f99ecffd2ea26" exitCode=0 Mar 18 13:14:37 crc kubenswrapper[4912]: I0318 13:14:37.863609 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"49978836919fd8cfdafc75ececc71c0f4203bbe58b30f6727a2f99ecffd2ea26"} Mar 18 13:14:37 crc kubenswrapper[4912]: I0318 13:14:37.863658 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"05cdb0519bbf08b4e978264b2bdfdf4662c568a31d460058df90a82a4a831459"} Mar 18 13:14:37 crc kubenswrapper[4912]: I0318 13:14:37.863682 4912 scope.go:117] "RemoveContainer" containerID="2eae871d72b2861d79999e929dd74343dc3d0189bcb36cebcb51e26fbc951242" Mar 18 13:14:39 crc kubenswrapper[4912]: I0318 13:14:39.885732 4912 generic.go:334] "Generic (PLEG): container finished" podID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerID="57ca63bbc15fa16b8c0327909d02a15f879f607231e9310ecc660329cc700579" exitCode=0 Mar 18 13:14:39 crc kubenswrapper[4912]: I0318 13:14:39.885866 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" event={"ID":"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb","Type":"ContainerDied","Data":"57ca63bbc15fa16b8c0327909d02a15f879f607231e9310ecc660329cc700579"} Mar 18 13:14:40 crc kubenswrapper[4912]: I0318 13:14:40.903933 4912 generic.go:334] "Generic (PLEG): container finished" podID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerID="41913b9494d9d18627f72337e1f2a9960a52631d5d4de6fcf09c836d63e68bd0" exitCode=0 Mar 18 13:14:40 crc kubenswrapper[4912]: I0318 13:14:40.903989 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" event={"ID":"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb","Type":"ContainerDied","Data":"41913b9494d9d18627f72337e1f2a9960a52631d5d4de6fcf09c836d63e68bd0"} Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.246559 4912 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.350421 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vljh\" (UniqueName: \"kubernetes.io/projected/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-kube-api-access-4vljh\") pod \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.350511 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-bundle\") pod \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.350664 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-util\") pod \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\" (UID: \"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb\") " Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.354580 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-bundle" (OuterVolumeSpecName: "bundle") pod "d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" (UID: "d97da3f2-c9b0-42d1-a5ea-795994d0f5cb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.359850 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-kube-api-access-4vljh" (OuterVolumeSpecName: "kube-api-access-4vljh") pod "d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" (UID: "d97da3f2-c9b0-42d1-a5ea-795994d0f5cb"). InnerVolumeSpecName "kube-api-access-4vljh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.366971 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-util" (OuterVolumeSpecName: "util") pod "d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" (UID: "d97da3f2-c9b0-42d1-a5ea-795994d0f5cb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.452748 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vljh\" (UniqueName: \"kubernetes.io/projected/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-kube-api-access-4vljh\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.452791 4912 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.452801 4912 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97da3f2-c9b0-42d1-a5ea-795994d0f5cb-util\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.921825 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" event={"ID":"d97da3f2-c9b0-42d1-a5ea-795994d0f5cb","Type":"ContainerDied","Data":"699088ef230cc7b1566bf2422ba8fa22730422f0fe489c51570536ca5d5d7031"} Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.921877 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="699088ef230cc7b1566bf2422ba8fa22730422f0fe489c51570536ca5d5d7031" Mar 18 13:14:42 crc kubenswrapper[4912]: I0318 13:14:42.921953 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g" Mar 18 13:14:47 crc kubenswrapper[4912]: I0318 13:14:47.316474 4912 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.008936 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf"] Mar 18 13:14:55 crc kubenswrapper[4912]: E0318 13:14:55.010068 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerName="pull" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.010082 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerName="pull" Mar 18 13:14:55 crc kubenswrapper[4912]: E0318 13:14:55.010099 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerName="util" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.010106 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerName="util" Mar 18 13:14:55 crc kubenswrapper[4912]: E0318 13:14:55.010118 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerName="extract" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.010126 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerName="extract" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.010252 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97da3f2-c9b0-42d1-a5ea-795994d0f5cb" containerName="extract" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.010724 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.012800 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-h6ck2" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.012851 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.013323 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.087078 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf"] Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.168274 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dps2p\" (UniqueName: \"kubernetes.io/projected/37808f2f-08d5-432e-8ad6-69ad0b0e573a-kube-api-access-dps2p\") pod \"obo-prometheus-operator-8ff7d675-2r5xf\" (UID: \"37808f2f-08d5-432e-8ad6-69ad0b0e573a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.269850 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dps2p\" (UniqueName: \"kubernetes.io/projected/37808f2f-08d5-432e-8ad6-69ad0b0e573a-kube-api-access-dps2p\") pod \"obo-prometheus-operator-8ff7d675-2r5xf\" (UID: \"37808f2f-08d5-432e-8ad6-69ad0b0e573a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.296665 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dps2p\" (UniqueName: \"kubernetes.io/projected/37808f2f-08d5-432e-8ad6-69ad0b0e573a-kube-api-access-dps2p\") pod 
\"obo-prometheus-operator-8ff7d675-2r5xf\" (UID: \"37808f2f-08d5-432e-8ad6-69ad0b0e573a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.331589 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.419087 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6"] Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.420653 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.427683 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.429295 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-sr97v" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.452765 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6"] Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.478232 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx"] Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.479990 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.503449 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx"] Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.577488 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-66zt6\" (UID: \"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.577581 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4045f06-e567-4dda-8192-2dbef917a7a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-flqsx\" (UID: \"d4045f06-e567-4dda-8192-2dbef917a7a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.577635 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-66zt6\" (UID: \"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.577676 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4045f06-e567-4dda-8192-2dbef917a7a0-webhook-cert\") 
pod \"obo-prometheus-operator-admission-webhook-68d7879b9-flqsx\" (UID: \"d4045f06-e567-4dda-8192-2dbef917a7a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.679289 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4045f06-e567-4dda-8192-2dbef917a7a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-flqsx\" (UID: \"d4045f06-e567-4dda-8192-2dbef917a7a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.679374 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-66zt6\" (UID: \"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.679414 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4045f06-e567-4dda-8192-2dbef917a7a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-flqsx\" (UID: \"d4045f06-e567-4dda-8192-2dbef917a7a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.679450 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-66zt6\" (UID: \"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.686889 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-66zt6\" (UID: \"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.699691 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4045f06-e567-4dda-8192-2dbef917a7a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-flqsx\" (UID: \"d4045f06-e567-4dda-8192-2dbef917a7a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.705536 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-66zt6\" (UID: \"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.706031 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4045f06-e567-4dda-8192-2dbef917a7a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d7879b9-flqsx\" (UID: \"d4045f06-e567-4dda-8192-2dbef917a7a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.746690 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.807222 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.813357 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf"] Mar 18 13:14:55 crc kubenswrapper[4912]: W0318 13:14:55.833529 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37808f2f_08d5_432e_8ad6_69ad0b0e573a.slice/crio-525f3675ce639bd7250f08a5db653065b64e4fe57f099eebe782e34429b2ce23 WatchSource:0}: Error finding container 525f3675ce639bd7250f08a5db653065b64e4fe57f099eebe782e34429b2ce23: Status 404 returned error can't find the container with id 525f3675ce639bd7250f08a5db653065b64e4fe57f099eebe782e34429b2ce23 Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.958075 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lcgrk"] Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.965223 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.972573 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.972734 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-m249t" Mar 18 13:14:55 crc kubenswrapper[4912]: I0318 13:14:55.991332 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lcgrk"] Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.044306 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" event={"ID":"37808f2f-08d5-432e-8ad6-69ad0b0e573a","Type":"ContainerStarted","Data":"525f3675ce639bd7250f08a5db653065b64e4fe57f099eebe782e34429b2ce23"} Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.086897 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffcc0a7f-efff-4a18-8002-7b33a557293c-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lcgrk\" (UID: \"ffcc0a7f-efff-4a18-8002-7b33a557293c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.086957 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbwfg\" (UniqueName: \"kubernetes.io/projected/ffcc0a7f-efff-4a18-8002-7b33a557293c-kube-api-access-tbwfg\") pod \"observability-operator-6dd7dd855f-lcgrk\" (UID: \"ffcc0a7f-efff-4a18-8002-7b33a557293c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.130550 4912 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6"] Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.190730 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffcc0a7f-efff-4a18-8002-7b33a557293c-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lcgrk\" (UID: \"ffcc0a7f-efff-4a18-8002-7b33a557293c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.190809 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbwfg\" (UniqueName: \"kubernetes.io/projected/ffcc0a7f-efff-4a18-8002-7b33a557293c-kube-api-access-tbwfg\") pod \"observability-operator-6dd7dd855f-lcgrk\" (UID: \"ffcc0a7f-efff-4a18-8002-7b33a557293c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.200220 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffcc0a7f-efff-4a18-8002-7b33a557293c-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lcgrk\" (UID: \"ffcc0a7f-efff-4a18-8002-7b33a557293c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.211289 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbwfg\" (UniqueName: \"kubernetes.io/projected/ffcc0a7f-efff-4a18-8002-7b33a557293c-kube-api-access-tbwfg\") pod \"observability-operator-6dd7dd855f-lcgrk\" (UID: \"ffcc0a7f-efff-4a18-8002-7b33a557293c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.237960 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx"] Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.334408 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.412049 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-7bb4554dcb-4hc2x"] Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.412968 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.415443 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.415679 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gsvrb" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.496502 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1062176-da75-4c7d-a3fc-b5ecee790973-webhook-cert\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.497014 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1062176-da75-4c7d-a3fc-b5ecee790973-apiservice-cert\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.497094 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgqzg\" (UniqueName: \"kubernetes.io/projected/b1062176-da75-4c7d-a3fc-b5ecee790973-kube-api-access-zgqzg\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.497142 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1062176-da75-4c7d-a3fc-b5ecee790973-openshift-service-ca\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.520541 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-7bb4554dcb-4hc2x"] Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.598283 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1062176-da75-4c7d-a3fc-b5ecee790973-apiservice-cert\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.598355 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgqzg\" (UniqueName: \"kubernetes.io/projected/b1062176-da75-4c7d-a3fc-b5ecee790973-kube-api-access-zgqzg\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.598396 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b1062176-da75-4c7d-a3fc-b5ecee790973-openshift-service-ca\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.598432 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1062176-da75-4c7d-a3fc-b5ecee790973-webhook-cert\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.600539 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1062176-da75-4c7d-a3fc-b5ecee790973-openshift-service-ca\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.630407 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1062176-da75-4c7d-a3fc-b5ecee790973-webhook-cert\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.633840 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgqzg\" (UniqueName: \"kubernetes.io/projected/b1062176-da75-4c7d-a3fc-b5ecee790973-kube-api-access-zgqzg\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.645491 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1062176-da75-4c7d-a3fc-b5ecee790973-apiservice-cert\") pod \"perses-operator-7bb4554dcb-4hc2x\" (UID: \"b1062176-da75-4c7d-a3fc-b5ecee790973\") " pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.726704 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lcgrk"] Mar 18 13:14:56 crc kubenswrapper[4912]: I0318 13:14:56.730389 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:14:56 crc kubenswrapper[4912]: W0318 13:14:56.733301 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcc0a7f_efff_4a18_8002_7b33a557293c.slice/crio-24a2823d201b64106bf2f03cf3222f8739f2d889ff8d4af30337d37dbddf54e0 WatchSource:0}: Error finding container 24a2823d201b64106bf2f03cf3222f8739f2d889ff8d4af30337d37dbddf54e0: Status 404 returned error can't find the container with id 24a2823d201b64106bf2f03cf3222f8739f2d889ff8d4af30337d37dbddf54e0 Mar 18 13:14:57 crc kubenswrapper[4912]: I0318 13:14:57.053828 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" event={"ID":"ffcc0a7f-efff-4a18-8002-7b33a557293c","Type":"ContainerStarted","Data":"24a2823d201b64106bf2f03cf3222f8739f2d889ff8d4af30337d37dbddf54e0"} Mar 18 13:14:57 crc kubenswrapper[4912]: I0318 13:14:57.057646 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" event={"ID":"d4045f06-e567-4dda-8192-2dbef917a7a0","Type":"ContainerStarted","Data":"b4f481b2447a6a14234474b7f6a8e5652b44702bd0cec94105ed983614f5343c"} Mar 18 13:14:57 crc kubenswrapper[4912]: I0318 13:14:57.059635 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" event={"ID":"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2","Type":"ContainerStarted","Data":"eab359fe1b818d9266c34e1693767136f223f8fcb319e0edac4aebf3aecb94b2"} Mar 18 13:14:57 crc kubenswrapper[4912]: I0318 13:14:57.064137 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-7bb4554dcb-4hc2x"] Mar 18 13:14:57 crc kubenswrapper[4912]: W0318 13:14:57.078925 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1062176_da75_4c7d_a3fc_b5ecee790973.slice/crio-4b17c10eb3a612b9220a54c68a61d3973f4aa60ce94ec11bc6e19c00b6b8f4b7 WatchSource:0}: Error finding container 4b17c10eb3a612b9220a54c68a61d3973f4aa60ce94ec11bc6e19c00b6b8f4b7: Status 404 returned error can't find the container with id 4b17c10eb3a612b9220a54c68a61d3973f4aa60ce94ec11bc6e19c00b6b8f4b7 Mar 18 13:14:58 crc kubenswrapper[4912]: I0318 13:14:58.084954 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" event={"ID":"b1062176-da75-4c7d-a3fc-b5ecee790973","Type":"ContainerStarted","Data":"4b17c10eb3a612b9220a54c68a61d3973f4aa60ce94ec11bc6e19c00b6b8f4b7"} Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.151443 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4"] Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.155360 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.158740 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.161765 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.166410 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4"] Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.245722 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fdc363-52c1-4525-aeaf-146ab5700fb3-config-volume\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.245900 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h2f4\" (UniqueName: \"kubernetes.io/projected/23fdc363-52c1-4525-aeaf-146ab5700fb3-kube-api-access-5h2f4\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.245929 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fdc363-52c1-4525-aeaf-146ab5700fb3-secret-volume\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.348023 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h2f4\" (UniqueName: \"kubernetes.io/projected/23fdc363-52c1-4525-aeaf-146ab5700fb3-kube-api-access-5h2f4\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.348106 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fdc363-52c1-4525-aeaf-146ab5700fb3-secret-volume\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.348155 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fdc363-52c1-4525-aeaf-146ab5700fb3-config-volume\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.349443 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fdc363-52c1-4525-aeaf-146ab5700fb3-config-volume\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.356776 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/23fdc363-52c1-4525-aeaf-146ab5700fb3-secret-volume\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.379018 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h2f4\" (UniqueName: \"kubernetes.io/projected/23fdc363-52c1-4525-aeaf-146ab5700fb3-kube-api-access-5h2f4\") pod \"collect-profiles-29563995-46cl4\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:00 crc kubenswrapper[4912]: I0318 13:15:00.513913 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:01 crc kubenswrapper[4912]: I0318 13:15:01.207865 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4"] Mar 18 13:15:01 crc kubenswrapper[4912]: W0318 13:15:01.232423 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fdc363_52c1_4525_aeaf_146ab5700fb3.slice/crio-6be152d6bc2f370e9a83130f5bbca460240360d61a4a4fffb62793604a6c01f2 WatchSource:0}: Error finding container 6be152d6bc2f370e9a83130f5bbca460240360d61a4a4fffb62793604a6c01f2: Status 404 returned error can't find the container with id 6be152d6bc2f370e9a83130f5bbca460240360d61a4a4fffb62793604a6c01f2 Mar 18 13:15:02 crc kubenswrapper[4912]: I0318 13:15:02.262372 4912 generic.go:334] "Generic (PLEG): container finished" podID="23fdc363-52c1-4525-aeaf-146ab5700fb3" containerID="2b8391e8a5000fd9ad79501cf910594a527cb4f931dfaa1905e8cdf38240ac23" exitCode=0 Mar 18 13:15:02 crc kubenswrapper[4912]: I0318 13:15:02.262705 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" event={"ID":"23fdc363-52c1-4525-aeaf-146ab5700fb3","Type":"ContainerDied","Data":"2b8391e8a5000fd9ad79501cf910594a527cb4f931dfaa1905e8cdf38240ac23"} Mar 18 13:15:02 crc kubenswrapper[4912]: I0318 13:15:02.262744 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" event={"ID":"23fdc363-52c1-4525-aeaf-146ab5700fb3","Type":"ContainerStarted","Data":"6be152d6bc2f370e9a83130f5bbca460240360d61a4a4fffb62793604a6c01f2"} Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.726872 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.895544 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fdc363-52c1-4525-aeaf-146ab5700fb3-secret-volume\") pod \"23fdc363-52c1-4525-aeaf-146ab5700fb3\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.897854 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fdc363-52c1-4525-aeaf-146ab5700fb3-config-volume\") pod \"23fdc363-52c1-4525-aeaf-146ab5700fb3\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.897984 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h2f4\" (UniqueName: \"kubernetes.io/projected/23fdc363-52c1-4525-aeaf-146ab5700fb3-kube-api-access-5h2f4\") pod \"23fdc363-52c1-4525-aeaf-146ab5700fb3\" (UID: \"23fdc363-52c1-4525-aeaf-146ab5700fb3\") " Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.898695 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/23fdc363-52c1-4525-aeaf-146ab5700fb3-config-volume" (OuterVolumeSpecName: "config-volume") pod "23fdc363-52c1-4525-aeaf-146ab5700fb3" (UID: "23fdc363-52c1-4525-aeaf-146ab5700fb3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.898902 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fdc363-52c1-4525-aeaf-146ab5700fb3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.904506 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fdc363-52c1-4525-aeaf-146ab5700fb3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23fdc363-52c1-4525-aeaf-146ab5700fb3" (UID: "23fdc363-52c1-4525-aeaf-146ab5700fb3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:15:05 crc kubenswrapper[4912]: I0318 13:15:05.907365 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fdc363-52c1-4525-aeaf-146ab5700fb3-kube-api-access-5h2f4" (OuterVolumeSpecName: "kube-api-access-5h2f4") pod "23fdc363-52c1-4525-aeaf-146ab5700fb3" (UID: "23fdc363-52c1-4525-aeaf-146ab5700fb3"). InnerVolumeSpecName "kube-api-access-5h2f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:15:06 crc kubenswrapper[4912]: I0318 13:15:06.000845 4912 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fdc363-52c1-4525-aeaf-146ab5700fb3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:06 crc kubenswrapper[4912]: I0318 13:15:06.000899 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h2f4\" (UniqueName: \"kubernetes.io/projected/23fdc363-52c1-4525-aeaf-146ab5700fb3-kube-api-access-5h2f4\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:06 crc kubenswrapper[4912]: I0318 13:15:06.311667 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" event={"ID":"23fdc363-52c1-4525-aeaf-146ab5700fb3","Type":"ContainerDied","Data":"6be152d6bc2f370e9a83130f5bbca460240360d61a4a4fffb62793604a6c01f2"} Mar 18 13:15:06 crc kubenswrapper[4912]: I0318 13:15:06.311713 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be152d6bc2f370e9a83130f5bbca460240360d61a4a4fffb62793604a6c01f2" Mar 18 13:15:06 crc kubenswrapper[4912]: I0318 13:15:06.311772 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4" Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.583728 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sns58"] Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.585013 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="nbdb" containerID="cri-o://bca229aab67540876b235ab78ea602d1fb804e17f3dd261c91692cad71bc8042" gracePeriod=30 Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.585144 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c4650c42543642637bb48d93405026a2703095327cf5b0f12a62d7a03c02ffec" gracePeriod=30 Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.585318 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="northd" containerID="cri-o://1142fd434aefb47b095035cb7e6ea0b08251b72a986559eddb2a5d98975576bf" gracePeriod=30 Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.585369 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="sbdb" containerID="cri-o://f7ac30531ed71ae70a733ca813143eb11ce982a4f0afdd3339e105e499b95193" gracePeriod=30 Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.585451 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-node" 
containerID="cri-o://d6ebbb1b430aa5a43ad82ccfc0e12882330301d269ada003e7d076f9fb7681ad" gracePeriod=30 Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.585587 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-acl-logging" containerID="cri-o://2653541d8a0fac899e494dbb6ad02c37bcf960a1555e528b0a2558d4cdbd1b00" gracePeriod=30 Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.585728 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-controller" containerID="cri-o://3deee2e05e350b74cc8df647f0e9ebe8eff851d1da3855bd525fb218cb39db5d" gracePeriod=30 Mar 18 13:15:11 crc kubenswrapper[4912]: I0318 13:15:11.656262 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovnkube-controller" containerID="cri-o://610141d1d9fc234ba7dc855038066b3f0904a0b83247a97f972c30f9f094df37" gracePeriod=30 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.388345 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-acl-logging/0.log" Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.388952 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-controller/0.log" Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389371 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="610141d1d9fc234ba7dc855038066b3f0904a0b83247a97f972c30f9f094df37" exitCode=0 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389398 4912 
generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="f7ac30531ed71ae70a733ca813143eb11ce982a4f0afdd3339e105e499b95193" exitCode=0 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389407 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="bca229aab67540876b235ab78ea602d1fb804e17f3dd261c91692cad71bc8042" exitCode=0 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389414 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="1142fd434aefb47b095035cb7e6ea0b08251b72a986559eddb2a5d98975576bf" exitCode=0 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389422 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="2653541d8a0fac899e494dbb6ad02c37bcf960a1555e528b0a2558d4cdbd1b00" exitCode=143 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389429 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="3deee2e05e350b74cc8df647f0e9ebe8eff851d1da3855bd525fb218cb39db5d" exitCode=143 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389454 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"610141d1d9fc234ba7dc855038066b3f0904a0b83247a97f972c30f9f094df37"} Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389509 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"f7ac30531ed71ae70a733ca813143eb11ce982a4f0afdd3339e105e499b95193"} Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389525 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"bca229aab67540876b235ab78ea602d1fb804e17f3dd261c91692cad71bc8042"} Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389539 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"1142fd434aefb47b095035cb7e6ea0b08251b72a986559eddb2a5d98975576bf"} Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389551 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"2653541d8a0fac899e494dbb6ad02c37bcf960a1555e528b0a2558d4cdbd1b00"} Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.389563 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"3deee2e05e350b74cc8df647f0e9ebe8eff851d1da3855bd525fb218cb39db5d"} Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.391434 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gdg7r_1b4e18f7-a93f-463f-a208-2002cdf73919/kube-multus/0.log" Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.391504 4912 generic.go:334] "Generic (PLEG): container finished" podID="1b4e18f7-a93f-463f-a208-2002cdf73919" containerID="aa30387b3d7b5eafd9e2361d21c1da8c6b47920a636403143a6e2a3d2e0e48c9" exitCode=2 Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.391545 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gdg7r" event={"ID":"1b4e18f7-a93f-463f-a208-2002cdf73919","Type":"ContainerDied","Data":"aa30387b3d7b5eafd9e2361d21c1da8c6b47920a636403143a6e2a3d2e0e48c9"} Mar 18 13:15:12 crc kubenswrapper[4912]: I0318 13:15:12.392207 4912 
scope.go:117] "RemoveContainer" containerID="aa30387b3d7b5eafd9e2361d21c1da8c6b47920a636403143a6e2a3d2e0e48c9" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.406179 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-acl-logging/0.log" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.406739 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-controller/0.log" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.407115 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="c4650c42543642637bb48d93405026a2703095327cf5b0f12a62d7a03c02ffec" exitCode=0 Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.407145 4912 generic.go:334] "Generic (PLEG): container finished" podID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerID="d6ebbb1b430aa5a43ad82ccfc0e12882330301d269ada003e7d076f9fb7681ad" exitCode=0 Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.407173 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"c4650c42543642637bb48d93405026a2703095327cf5b0f12a62d7a03c02ffec"} Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.407205 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"d6ebbb1b430aa5a43ad82ccfc0e12882330301d269ada003e7d076f9fb7681ad"} Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.586191 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-acl-logging/0.log" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 
13:15:13.588243 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-controller/0.log" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.588875 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.659930 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dc9df"] Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660345 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-controller" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.660373 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-controller" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660387 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-node" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.660396 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-node" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660419 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kubecfg-setup" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.660429 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kubecfg-setup" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660447 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 13:15:13 crc kubenswrapper[4912]: 
I0318 13:15:13.660456 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660466 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="nbdb" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.660474 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="nbdb" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660488 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovnkube-controller" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.660495 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovnkube-controller" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660508 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="sbdb" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.660516 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="sbdb" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.660525 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fdc363-52c1-4525-aeaf-146ab5700fb3" containerName="collect-profiles" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.661865 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fdc363-52c1-4525-aeaf-146ab5700fb3" containerName="collect-profiles" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.661891 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-acl-logging" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.661898 4912 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-acl-logging" Mar 18 13:15:13 crc kubenswrapper[4912]: E0318 13:15:13.661908 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="northd" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.661916 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="northd" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662086 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-acl-logging" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662104 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovnkube-controller" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662117 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="nbdb" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662132 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="ovn-controller" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662147 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="northd" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662163 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="sbdb" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662173 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-node" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662183 4912 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.662191 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fdc363-52c1-4525-aeaf-146ab5700fb3" containerName="collect-profiles" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.664705 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743398 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-config\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743467 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovn-node-metrics-cert\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743496 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-ovn\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743519 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-var-lib-openvswitch\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 
13:15:13.743538 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbjdr\" (UniqueName: \"kubernetes.io/projected/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-kube-api-access-sbjdr\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743567 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-ovn-kubernetes\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743589 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-script-lib\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743822 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-env-overrides\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743837 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-systemd-units\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743885 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-netd\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743904 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-etc-openvswitch\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743937 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-log-socket\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743967 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-bin\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.743988 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-slash\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744063 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-node-log\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744091 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-kubelet\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744155 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744180 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-systemd\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744200 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-openvswitch\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744228 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-netns\") pod \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\" (UID: \"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b\") " Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744438 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-kubelet\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744485 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-cni-netd\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744520 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-var-lib-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744555 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-ovn\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744588 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-etc-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744614 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744646 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-ovnkube-config\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744673 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-cni-bin\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744697 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-systemd\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744737 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-systemd-units\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744759 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744798 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-run-netns\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744825 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-ovnkube-script-lib\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744853 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f16d56e-91d7-4019-831e-1215b783f7b4-ovn-node-metrics-cert\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744877 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-slash\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744906 
4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-env-overrides\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744946 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.744971 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tms7g\" (UniqueName: \"kubernetes.io/projected/8f16d56e-91d7-4019-831e-1215b783f7b4-kube-api-access-tms7g\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.745001 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-node-log\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.745031 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-log-socket\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc 
kubenswrapper[4912]: I0318 13:15:13.745568 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.746174 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.746162 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.746418 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747030 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747359 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-node-log" (OuterVolumeSpecName: "node-log") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747395 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747405 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747434 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747449 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747475 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747499 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747508 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-log-socket" (OuterVolumeSpecName: "log-socket") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747538 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747571 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-slash" (OuterVolumeSpecName: "host-slash") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747715 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.747750 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.754874 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-kube-api-access-sbjdr" (OuterVolumeSpecName: "kube-api-access-sbjdr") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "kube-api-access-sbjdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.757183 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.792641 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" (UID: "c5fc3074-5b30-4c2d-ae24-dfa5de9b835b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846274 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-ovn\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846418 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-etc-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846502 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846579 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-ovnkube-config\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846664 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-cni-bin\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846758 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-systemd\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846858 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-systemd-units\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846926 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847065 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-run-netns\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847169 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-ovnkube-script-lib\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc 
kubenswrapper[4912]: I0318 13:15:13.847266 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f16d56e-91d7-4019-831e-1215b783f7b4-ovn-node-metrics-cert\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847360 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-slash\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847430 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-systemd\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846456 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-etc-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846415 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-ovn\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847404 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-cni-bin\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.846583 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847519 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-run-netns\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847556 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-systemd-units\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847586 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-run-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847748 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-env-overrides\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847843 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847912 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tms7g\" (UniqueName: \"kubernetes.io/projected/8f16d56e-91d7-4019-831e-1215b783f7b4-kube-api-access-tms7g\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.847370 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-ovnkube-config\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848073 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-slash\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848107 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848010 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-node-log\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848278 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-log-socket\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848388 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-kubelet\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848515 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-var-lib-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848622 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-cni-netd\") pod 
\"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848699 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-node-log\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848777 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-ovnkube-script-lib\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848921 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-log-socket\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.848931 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-cni-netd\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849143 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-var-lib-openvswitch\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" 
Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849260 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f16d56e-91d7-4019-831e-1215b783f7b4-host-kubelet\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849373 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbjdr\" (UniqueName: \"kubernetes.io/projected/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-kube-api-access-sbjdr\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849460 4912 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849545 4912 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849620 4912 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849702 4912 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849778 4912 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-script-lib\") on 
node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849862 4912 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.849942 4912 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.850020 4912 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.850128 4912 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.850196 4912 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.850253 4912 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.850319 4912 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.850408 4912 reconciler_common.go:293] "Volume detached for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.850787 4912 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.851351 4912 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.851435 4912 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.851512 4912 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.851590 4912 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.851669 4912 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.851848 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8f16d56e-91d7-4019-831e-1215b783f7b4-env-overrides\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.853737 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f16d56e-91d7-4019-831e-1215b783f7b4-ovn-node-metrics-cert\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:13 crc kubenswrapper[4912]: I0318 13:15:13.886713 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tms7g\" (UniqueName: \"kubernetes.io/projected/8f16d56e-91d7-4019-831e-1215b783f7b4-kube-api-access-tms7g\") pod \"ovnkube-node-dc9df\" (UID: \"8f16d56e-91d7-4019-831e-1215b783f7b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.012566 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:14 crc kubenswrapper[4912]: W0318 13:15:14.046946 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f16d56e_91d7_4019_831e_1215b783f7b4.slice/crio-58441788310f188d715a1ce3d20614b8a517a501a51f3ebbb18f360ab57774ff WatchSource:0}: Error finding container 58441788310f188d715a1ce3d20614b8a517a501a51f3ebbb18f360ab57774ff: Status 404 returned error can't find the container with id 58441788310f188d715a1ce3d20614b8a517a501a51f3ebbb18f360ab57774ff Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.418563 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-acl-logging/0.log" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.420059 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sns58_c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/ovn-controller/0.log" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.420823 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" event={"ID":"c5fc3074-5b30-4c2d-ae24-dfa5de9b835b","Type":"ContainerDied","Data":"c3a41ec619bf924cc9836633f286ebf282abcd38faa1f015861d1e6dc11fc903"} Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.420876 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sns58" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.420901 4912 scope.go:117] "RemoveContainer" containerID="610141d1d9fc234ba7dc855038066b3f0904a0b83247a97f972c30f9f094df37" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.423805 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" event={"ID":"0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2","Type":"ContainerStarted","Data":"087eeb479df878740fcbc41016d1f8b5fe22ce076d5e97f43520248e7d470ced"} Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.427178 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" event={"ID":"b1062176-da75-4c7d-a3fc-b5ecee790973","Type":"ContainerStarted","Data":"4f863e1455092cf8132ac287f47fffebd2488949c5e5cfde853d1d748938bcb7"} Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.427424 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.428930 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" event={"ID":"ffcc0a7f-efff-4a18-8002-7b33a557293c","Type":"ContainerStarted","Data":"2ba64e130538183d360474f8a6200bbd6bf5d65ea35f441cd4473f2cfe0acfaa"} Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.431328 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.435739 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" event={"ID":"37808f2f-08d5-432e-8ad6-69ad0b0e573a","Type":"ContainerStarted","Data":"c9a355ff5177c6371c2b72a064b695014339183775fdfd6b70da43458f99fe6b"} 
Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.437507 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" event={"ID":"d4045f06-e567-4dda-8192-2dbef917a7a0","Type":"ContainerStarted","Data":"9a15e45affb1af82cdffa65fbfdf053959e2e2125356af31bcb40480c452f5d0"} Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.438499 4912 generic.go:334] "Generic (PLEG): container finished" podID="8f16d56e-91d7-4019-831e-1215b783f7b4" containerID="099de8e6c1f09c54268b84a79532854304f5123b64df5cae6a9996a298abaf4c" exitCode=0 Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.438667 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerDied","Data":"099de8e6c1f09c54268b84a79532854304f5123b64df5cae6a9996a298abaf4c"} Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.438761 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"58441788310f188d715a1ce3d20614b8a517a501a51f3ebbb18f360ab57774ff"} Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.462834 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.477533 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gdg7r_1b4e18f7-a93f-463f-a208-2002cdf73919/kube-multus/0.log" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.477611 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gdg7r" event={"ID":"1b4e18f7-a93f-463f-a208-2002cdf73919","Type":"ContainerStarted","Data":"a9de22991ac07bbcda492b7baeabd9d11b94789658c469f41db5934dad424e9b"} Mar 18 13:15:14 crc 
kubenswrapper[4912]: I0318 13:15:14.477812 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" podStartSLOduration=2.64596371 podStartE2EDuration="19.477789569s" podCreationTimestamp="2026-03-18 13:14:55 +0000 UTC" firstStartedPulling="2026-03-18 13:14:56.7426506 +0000 UTC m=+745.202078025" lastFinishedPulling="2026-03-18 13:15:13.574476459 +0000 UTC m=+762.033903884" observedRunningTime="2026-03-18 13:15:14.475027395 +0000 UTC m=+762.934454830" watchObservedRunningTime="2026-03-18 13:15:14.477789569 +0000 UTC m=+762.937216994" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.491000 4912 scope.go:117] "RemoveContainer" containerID="f7ac30531ed71ae70a733ca813143eb11ce982a4f0afdd3339e105e499b95193" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.525485 4912 scope.go:117] "RemoveContainer" containerID="bca229aab67540876b235ab78ea602d1fb804e17f3dd261c91692cad71bc8042" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.530639 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" podStartSLOduration=2.069628074 podStartE2EDuration="18.53061195s" podCreationTimestamp="2026-03-18 13:14:56 +0000 UTC" firstStartedPulling="2026-03-18 13:14:57.083185908 +0000 UTC m=+745.542613333" lastFinishedPulling="2026-03-18 13:15:13.544169784 +0000 UTC m=+762.003597209" observedRunningTime="2026-03-18 13:15:14.515542805 +0000 UTC m=+762.974970250" watchObservedRunningTime="2026-03-18 13:15:14.53061195 +0000 UTC m=+762.990039385" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.534814 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sns58"] Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.564917 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sns58"] Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 
13:15:14.584649 4912 scope.go:117] "RemoveContainer" containerID="1142fd434aefb47b095035cb7e6ea0b08251b72a986559eddb2a5d98975576bf" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.586764 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-66zt6" podStartSLOduration=2.2021526160000002 podStartE2EDuration="19.586745219s" podCreationTimestamp="2026-03-18 13:14:55 +0000 UTC" firstStartedPulling="2026-03-18 13:14:56.147308241 +0000 UTC m=+744.606735666" lastFinishedPulling="2026-03-18 13:15:13.531900844 +0000 UTC m=+761.991328269" observedRunningTime="2026-03-18 13:15:14.584304284 +0000 UTC m=+763.043731719" watchObservedRunningTime="2026-03-18 13:15:14.586745219 +0000 UTC m=+763.046172644" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.615310 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-2r5xf" podStartSLOduration=2.880016475 podStartE2EDuration="20.615288007s" podCreationTimestamp="2026-03-18 13:14:54 +0000 UTC" firstStartedPulling="2026-03-18 13:14:55.838065146 +0000 UTC m=+744.297492571" lastFinishedPulling="2026-03-18 13:15:13.573336678 +0000 UTC m=+762.032764103" observedRunningTime="2026-03-18 13:15:14.612874242 +0000 UTC m=+763.072301677" watchObservedRunningTime="2026-03-18 13:15:14.615288007 +0000 UTC m=+763.074715432" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.627629 4912 scope.go:117] "RemoveContainer" containerID="c4650c42543642637bb48d93405026a2703095327cf5b0f12a62d7a03c02ffec" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.652832 4912 scope.go:117] "RemoveContainer" containerID="d6ebbb1b430aa5a43ad82ccfc0e12882330301d269ada003e7d076f9fb7681ad" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.687413 4912 scope.go:117] "RemoveContainer" containerID="2653541d8a0fac899e494dbb6ad02c37bcf960a1555e528b0a2558d4cdbd1b00" Mar 18 13:15:14 
crc kubenswrapper[4912]: I0318 13:15:14.716106 4912 scope.go:117] "RemoveContainer" containerID="3deee2e05e350b74cc8df647f0e9ebe8eff851d1da3855bd525fb218cb39db5d" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.748135 4912 scope.go:117] "RemoveContainer" containerID="c52b03a6de15f95580fbf97f6340e58e0a48e6aa02e7684de32fd41ce749e436" Mar 18 13:15:14 crc kubenswrapper[4912]: I0318 13:15:14.754336 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d7879b9-flqsx" podStartSLOduration=2.458370587 podStartE2EDuration="19.754318066s" podCreationTimestamp="2026-03-18 13:14:55 +0000 UTC" firstStartedPulling="2026-03-18 13:14:56.243186759 +0000 UTC m=+744.702614174" lastFinishedPulling="2026-03-18 13:15:13.539134238 +0000 UTC m=+761.998561653" observedRunningTime="2026-03-18 13:15:14.754076539 +0000 UTC m=+763.213503974" watchObservedRunningTime="2026-03-18 13:15:14.754318066 +0000 UTC m=+763.213745491" Mar 18 13:15:15 crc kubenswrapper[4912]: I0318 13:15:15.486969 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"83ed75a9bee498b7666163b5b00b967aef290992ff1587627450fc188366e4ff"} Mar 18 13:15:15 crc kubenswrapper[4912]: I0318 13:15:15.487359 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"0a02234ebdcc2fcab4db4d7ba8d7801cc0ddd3cb46799efa320b940c2c06e8af"} Mar 18 13:15:15 crc kubenswrapper[4912]: I0318 13:15:15.487372 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"e7e754d0b69554d452158e6b86a49eefcdcbb5c1da0eec5ee3b95c9aebfc0605"} Mar 18 13:15:15 crc 
kubenswrapper[4912]: I0318 13:15:15.487386 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"e4d13a7528a37d936e71d58ea3a42e76a0653a9229c39cba79ab39ee78c99f92"} Mar 18 13:15:15 crc kubenswrapper[4912]: I0318 13:15:15.487396 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"ed83b11848ce1a9a1bad49b8ca4889f6336ba3469b1086bcc7394c80f65463f0"} Mar 18 13:15:16 crc kubenswrapper[4912]: I0318 13:15:16.236153 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5fc3074-5b30-4c2d-ae24-dfa5de9b835b" path="/var/lib/kubelet/pods/c5fc3074-5b30-4c2d-ae24-dfa5de9b835b/volumes" Mar 18 13:15:16 crc kubenswrapper[4912]: I0318 13:15:16.499224 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"07832971b5feb1ca370e1ccf5039af4106978b63134feeb3e1fe4f3ff976f6c0"} Mar 18 13:15:18 crc kubenswrapper[4912]: I0318 13:15:18.518813 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"b3d0285b64a2012874a3d42ad51b1445356fdf18aaef86df818894a610165d49"} Mar 18 13:15:20 crc kubenswrapper[4912]: I0318 13:15:20.534495 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" event={"ID":"8f16d56e-91d7-4019-831e-1215b783f7b4","Type":"ContainerStarted","Data":"0d2d225428f50f9237703fdedd6aa9baa4be688c70f727d93817153bfae983a5"} Mar 18 13:15:20 crc kubenswrapper[4912]: I0318 13:15:20.536519 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:20 crc kubenswrapper[4912]: I0318 13:15:20.536552 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:20 crc kubenswrapper[4912]: I0318 13:15:20.536599 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:20 crc kubenswrapper[4912]: I0318 13:15:20.584410 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" podStartSLOduration=7.584382924 podStartE2EDuration="7.584382924s" podCreationTimestamp="2026-03-18 13:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:15:20.578940617 +0000 UTC m=+769.038368062" watchObservedRunningTime="2026-03-18 13:15:20.584382924 +0000 UTC m=+769.043810349" Mar 18 13:15:20 crc kubenswrapper[4912]: I0318 13:15:20.591258 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:20 crc kubenswrapper[4912]: I0318 13:15:20.596398 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.921253 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv"] Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.923417 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.933152 4912 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2kzkn" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.933452 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.939234 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.948081 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cxrkp"] Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.949081 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.959324 4912 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mkmsj" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.974095 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-94ssf"] Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.975594 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-94ssf" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.978426 4912 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6w7q6" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.983734 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cxrkp"] Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.993200 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-94ssf"] Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.998194 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdgpn\" (UniqueName: \"kubernetes.io/projected/fab70011-1512-4414-9319-247cf2ccd2b2-kube-api-access-jdgpn\") pod \"cert-manager-858654f9db-94ssf\" (UID: \"fab70011-1512-4414-9319-247cf2ccd2b2\") " pod="cert-manager/cert-manager-858654f9db-94ssf" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.998301 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5ft\" (UniqueName: \"kubernetes.io/projected/0dcabbbc-e386-4bcf-9fc6-51e388ad3d36-kube-api-access-vr5ft\") pod \"cert-manager-cainjector-cf98fcc89-n7gmv\" (UID: \"0dcabbbc-e386-4bcf-9fc6-51e388ad3d36\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" Mar 18 13:15:25 crc kubenswrapper[4912]: I0318 13:15:25.998343 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j8g8\" (UniqueName: \"kubernetes.io/projected/c672e269-a0f9-42e0-964c-ea26f3d86a58-kube-api-access-8j8g8\") pod \"cert-manager-webhook-687f57d79b-cxrkp\" (UID: \"c672e269-a0f9-42e0-964c-ea26f3d86a58\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.008866 4912 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv"] Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.099718 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5ft\" (UniqueName: \"kubernetes.io/projected/0dcabbbc-e386-4bcf-9fc6-51e388ad3d36-kube-api-access-vr5ft\") pod \"cert-manager-cainjector-cf98fcc89-n7gmv\" (UID: \"0dcabbbc-e386-4bcf-9fc6-51e388ad3d36\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.099822 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j8g8\" (UniqueName: \"kubernetes.io/projected/c672e269-a0f9-42e0-964c-ea26f3d86a58-kube-api-access-8j8g8\") pod \"cert-manager-webhook-687f57d79b-cxrkp\" (UID: \"c672e269-a0f9-42e0-964c-ea26f3d86a58\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.099900 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdgpn\" (UniqueName: \"kubernetes.io/projected/fab70011-1512-4414-9319-247cf2ccd2b2-kube-api-access-jdgpn\") pod \"cert-manager-858654f9db-94ssf\" (UID: \"fab70011-1512-4414-9319-247cf2ccd2b2\") " pod="cert-manager/cert-manager-858654f9db-94ssf" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.123439 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5ft\" (UniqueName: \"kubernetes.io/projected/0dcabbbc-e386-4bcf-9fc6-51e388ad3d36-kube-api-access-vr5ft\") pod \"cert-manager-cainjector-cf98fcc89-n7gmv\" (UID: \"0dcabbbc-e386-4bcf-9fc6-51e388ad3d36\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.123529 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdgpn\" (UniqueName: 
\"kubernetes.io/projected/fab70011-1512-4414-9319-247cf2ccd2b2-kube-api-access-jdgpn\") pod \"cert-manager-858654f9db-94ssf\" (UID: \"fab70011-1512-4414-9319-247cf2ccd2b2\") " pod="cert-manager/cert-manager-858654f9db-94ssf" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.125305 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j8g8\" (UniqueName: \"kubernetes.io/projected/c672e269-a0f9-42e0-964c-ea26f3d86a58-kube-api-access-8j8g8\") pod \"cert-manager-webhook-687f57d79b-cxrkp\" (UID: \"c672e269-a0f9-42e0-964c-ea26f3d86a58\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.243195 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.264410 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.293249 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-94ssf" Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.560636 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cxrkp"] Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.572253 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.581175 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" event={"ID":"c672e269-a0f9-42e0-964c-ea26f3d86a58","Type":"ContainerStarted","Data":"0321d11b040599f90ad16412b5c9e3c89825d0a1c310f9b05140dd58fdf0b5ee"} Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.629362 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv"] Mar 18 13:15:26 crc kubenswrapper[4912]: W0318 13:15:26.630968 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dcabbbc_e386_4bcf_9fc6_51e388ad3d36.slice/crio-6f219f2f6a45325cfd6e53f13d849a02930dc886857c81a53f57cce514d5d090 WatchSource:0}: Error finding container 6f219f2f6a45325cfd6e53f13d849a02930dc886857c81a53f57cce514d5d090: Status 404 returned error can't find the container with id 6f219f2f6a45325cfd6e53f13d849a02930dc886857c81a53f57cce514d5d090 Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.687199 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-94ssf"] Mar 18 13:15:26 crc kubenswrapper[4912]: W0318 13:15:26.692715 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfab70011_1512_4414_9319_247cf2ccd2b2.slice/crio-29d794e039ea9f50ea2b7bd6d6db99184c7d374b085af327c8db994fb5a76d9d WatchSource:0}: Error finding container 
29d794e039ea9f50ea2b7bd6d6db99184c7d374b085af327c8db994fb5a76d9d: Status 404 returned error can't find the container with id 29d794e039ea9f50ea2b7bd6d6db99184c7d374b085af327c8db994fb5a76d9d Mar 18 13:15:26 crc kubenswrapper[4912]: I0318 13:15:26.734603 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" Mar 18 13:15:27 crc kubenswrapper[4912]: I0318 13:15:27.589433 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" event={"ID":"0dcabbbc-e386-4bcf-9fc6-51e388ad3d36","Type":"ContainerStarted","Data":"6f219f2f6a45325cfd6e53f13d849a02930dc886857c81a53f57cce514d5d090"} Mar 18 13:15:27 crc kubenswrapper[4912]: I0318 13:15:27.590938 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-94ssf" event={"ID":"fab70011-1512-4414-9319-247cf2ccd2b2","Type":"ContainerStarted","Data":"29d794e039ea9f50ea2b7bd6d6db99184c7d374b085af327c8db994fb5a76d9d"} Mar 18 13:15:34 crc kubenswrapper[4912]: I0318 13:15:34.665947 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" event={"ID":"0dcabbbc-e386-4bcf-9fc6-51e388ad3d36","Type":"ContainerStarted","Data":"8f2d3bc50f93e0b39a1d705b90825de14648225d1a0adde0a672cede39aaa1ec"} Mar 18 13:15:34 crc kubenswrapper[4912]: I0318 13:15:34.668673 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-94ssf" event={"ID":"fab70011-1512-4414-9319-247cf2ccd2b2","Type":"ContainerStarted","Data":"95af27f86ce4d78a22cf2f4061e7fdf7eb269081ff075e85bd2e09c5bb82afd3"} Mar 18 13:15:34 crc kubenswrapper[4912]: I0318 13:15:34.670490 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" 
event={"ID":"c672e269-a0f9-42e0-964c-ea26f3d86a58","Type":"ContainerStarted","Data":"9fc0e34522bcaa0a6df9a9160bd32679596805fe4f93c5c16453ef2f77409643"} Mar 18 13:15:34 crc kubenswrapper[4912]: I0318 13:15:34.670638 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" Mar 18 13:15:34 crc kubenswrapper[4912]: I0318 13:15:34.690481 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n7gmv" podStartSLOduration=2.498658677 podStartE2EDuration="9.690459761s" podCreationTimestamp="2026-03-18 13:15:25 +0000 UTC" firstStartedPulling="2026-03-18 13:15:26.632949186 +0000 UTC m=+775.092376611" lastFinishedPulling="2026-03-18 13:15:33.82475026 +0000 UTC m=+782.284177695" observedRunningTime="2026-03-18 13:15:34.687587954 +0000 UTC m=+783.147015379" watchObservedRunningTime="2026-03-18 13:15:34.690459761 +0000 UTC m=+783.149887176" Mar 18 13:15:34 crc kubenswrapper[4912]: I0318 13:15:34.731390 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-94ssf" podStartSLOduration=2.53296856 podStartE2EDuration="9.731362931s" podCreationTimestamp="2026-03-18 13:15:25 +0000 UTC" firstStartedPulling="2026-03-18 13:15:26.695963081 +0000 UTC m=+775.155390506" lastFinishedPulling="2026-03-18 13:15:33.894357452 +0000 UTC m=+782.353784877" observedRunningTime="2026-03-18 13:15:34.729190492 +0000 UTC m=+783.188617917" watchObservedRunningTime="2026-03-18 13:15:34.731362931 +0000 UTC m=+783.190790356" Mar 18 13:15:34 crc kubenswrapper[4912]: I0318 13:15:34.731837 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" podStartSLOduration=2.470307614 podStartE2EDuration="9.731830963s" podCreationTimestamp="2026-03-18 13:15:25 +0000 UTC" firstStartedPulling="2026-03-18 13:15:26.571073572 +0000 UTC m=+775.030500987" 
lastFinishedPulling="2026-03-18 13:15:33.832596911 +0000 UTC m=+782.292024336" observedRunningTime="2026-03-18 13:15:34.713536941 +0000 UTC m=+783.172964366" watchObservedRunningTime="2026-03-18 13:15:34.731830963 +0000 UTC m=+783.191258388" Mar 18 13:15:41 crc kubenswrapper[4912]: I0318 13:15:41.268383 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" Mar 18 13:15:44 crc kubenswrapper[4912]: I0318 13:15:44.044368 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dc9df" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.258026 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pn8pb"] Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.260828 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.278093 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn8pb"] Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.362730 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-catalog-content\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.363609 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2s8\" (UniqueName: \"kubernetes.io/projected/31fd4b06-fc40-4755-9bdd-de49df0960ef-kube-api-access-kf2s8\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " 
pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.363635 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-utilities\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.464527 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-catalog-content\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.464596 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2s8\" (UniqueName: \"kubernetes.io/projected/31fd4b06-fc40-4755-9bdd-de49df0960ef-kube-api-access-kf2s8\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.464626 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-utilities\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.465008 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-catalog-content\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " 
pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.465098 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-utilities\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.487545 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2s8\" (UniqueName: \"kubernetes.io/projected/31fd4b06-fc40-4755-9bdd-de49df0960ef-kube-api-access-kf2s8\") pod \"certified-operators-pn8pb\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.595439 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:15:57 crc kubenswrapper[4912]: I0318 13:15:57.893015 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn8pb"] Mar 18 13:15:58 crc kubenswrapper[4912]: I0318 13:15:58.858107 4912 generic.go:334] "Generic (PLEG): container finished" podID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerID="983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a" exitCode=0 Mar 18 13:15:58 crc kubenswrapper[4912]: I0318 13:15:58.858170 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn8pb" event={"ID":"31fd4b06-fc40-4755-9bdd-de49df0960ef","Type":"ContainerDied","Data":"983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a"} Mar 18 13:15:58 crc kubenswrapper[4912]: I0318 13:15:58.858206 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn8pb" 
event={"ID":"31fd4b06-fc40-4755-9bdd-de49df0960ef","Type":"ContainerStarted","Data":"5bafa1563a9e76b546bc469f00cc989c8e653d75378ba1620d99be3e0570d8c7"} Mar 18 13:15:59 crc kubenswrapper[4912]: I0318 13:15:59.870940 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn8pb" event={"ID":"31fd4b06-fc40-4755-9bdd-de49df0960ef","Type":"ContainerStarted","Data":"fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5"} Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.143005 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563996-qxbxk"] Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.145819 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-qxbxk" Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.149667 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-qxbxk"] Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.189249 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.190142 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.190189 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.222148 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njqtz\" (UniqueName: \"kubernetes.io/projected/203f8736-4c2f-46af-8bfb-191c0d12f031-kube-api-access-njqtz\") pod \"auto-csr-approver-29563996-qxbxk\" (UID: \"203f8736-4c2f-46af-8bfb-191c0d12f031\") " pod="openshift-infra/auto-csr-approver-29563996-qxbxk" Mar 18 13:16:00 
crc kubenswrapper[4912]: I0318 13:16:00.324159 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njqtz\" (UniqueName: \"kubernetes.io/projected/203f8736-4c2f-46af-8bfb-191c0d12f031-kube-api-access-njqtz\") pod \"auto-csr-approver-29563996-qxbxk\" (UID: \"203f8736-4c2f-46af-8bfb-191c0d12f031\") " pod="openshift-infra/auto-csr-approver-29563996-qxbxk" Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.345930 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqtz\" (UniqueName: \"kubernetes.io/projected/203f8736-4c2f-46af-8bfb-191c0d12f031-kube-api-access-njqtz\") pod \"auto-csr-approver-29563996-qxbxk\" (UID: \"203f8736-4c2f-46af-8bfb-191c0d12f031\") " pod="openshift-infra/auto-csr-approver-29563996-qxbxk" Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.542884 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-qxbxk" Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.776023 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-qxbxk"] Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.880409 4912 generic.go:334] "Generic (PLEG): container finished" podID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerID="fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5" exitCode=0 Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.880457 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn8pb" event={"ID":"31fd4b06-fc40-4755-9bdd-de49df0960ef","Type":"ContainerDied","Data":"fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5"} Mar 18 13:16:00 crc kubenswrapper[4912]: I0318 13:16:00.882260 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-qxbxk" 
event={"ID":"203f8736-4c2f-46af-8bfb-191c0d12f031","Type":"ContainerStarted","Data":"da7f61c55c66123c267571b431fc991416eb45a65f065a100abd98ee30c052e5"} Mar 18 13:16:02 crc kubenswrapper[4912]: I0318 13:16:02.909417 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn8pb" event={"ID":"31fd4b06-fc40-4755-9bdd-de49df0960ef","Type":"ContainerStarted","Data":"b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6"} Mar 18 13:16:02 crc kubenswrapper[4912]: I0318 13:16:02.933488 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pn8pb" podStartSLOduration=2.468976605 podStartE2EDuration="5.933464689s" podCreationTimestamp="2026-03-18 13:15:57 +0000 UTC" firstStartedPulling="2026-03-18 13:15:58.861784906 +0000 UTC m=+807.321212351" lastFinishedPulling="2026-03-18 13:16:02.326273 +0000 UTC m=+810.785700435" observedRunningTime="2026-03-18 13:16:02.931076905 +0000 UTC m=+811.390504340" watchObservedRunningTime="2026-03-18 13:16:02.933464689 +0000 UTC m=+811.392892114" Mar 18 13:16:03 crc kubenswrapper[4912]: I0318 13:16:03.918184 4912 generic.go:334] "Generic (PLEG): container finished" podID="203f8736-4c2f-46af-8bfb-191c0d12f031" containerID="9e65f7f2871c61a10f1b4fab46e56bd94dc25d496a74b4f891c21f68521ebf6c" exitCode=0 Mar 18 13:16:03 crc kubenswrapper[4912]: I0318 13:16:03.918373 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-qxbxk" event={"ID":"203f8736-4c2f-46af-8bfb-191c0d12f031","Type":"ContainerDied","Data":"9e65f7f2871c61a10f1b4fab46e56bd94dc25d496a74b4f891c21f68521ebf6c"} Mar 18 13:16:05 crc kubenswrapper[4912]: I0318 13:16:05.190210 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-qxbxk" Mar 18 13:16:05 crc kubenswrapper[4912]: I0318 13:16:05.214783 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njqtz\" (UniqueName: \"kubernetes.io/projected/203f8736-4c2f-46af-8bfb-191c0d12f031-kube-api-access-njqtz\") pod \"203f8736-4c2f-46af-8bfb-191c0d12f031\" (UID: \"203f8736-4c2f-46af-8bfb-191c0d12f031\") " Mar 18 13:16:05 crc kubenswrapper[4912]: I0318 13:16:05.221351 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203f8736-4c2f-46af-8bfb-191c0d12f031-kube-api-access-njqtz" (OuterVolumeSpecName: "kube-api-access-njqtz") pod "203f8736-4c2f-46af-8bfb-191c0d12f031" (UID: "203f8736-4c2f-46af-8bfb-191c0d12f031"). InnerVolumeSpecName "kube-api-access-njqtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:16:05 crc kubenswrapper[4912]: I0318 13:16:05.316782 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njqtz\" (UniqueName: \"kubernetes.io/projected/203f8736-4c2f-46af-8bfb-191c0d12f031-kube-api-access-njqtz\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:05 crc kubenswrapper[4912]: I0318 13:16:05.943611 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-qxbxk" event={"ID":"203f8736-4c2f-46af-8bfb-191c0d12f031","Type":"ContainerDied","Data":"da7f61c55c66123c267571b431fc991416eb45a65f065a100abd98ee30c052e5"} Mar 18 13:16:05 crc kubenswrapper[4912]: I0318 13:16:05.944350 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da7f61c55c66123c267571b431fc991416eb45a65f065a100abd98ee30c052e5" Mar 18 13:16:05 crc kubenswrapper[4912]: I0318 13:16:05.944582 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-qxbxk" Mar 18 13:16:06 crc kubenswrapper[4912]: I0318 13:16:06.255885 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-k6q9w"] Mar 18 13:16:06 crc kubenswrapper[4912]: I0318 13:16:06.262801 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-k6q9w"] Mar 18 13:16:07 crc kubenswrapper[4912]: I0318 13:16:07.596488 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:16:07 crc kubenswrapper[4912]: I0318 13:16:07.596563 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:16:07 crc kubenswrapper[4912]: I0318 13:16:07.679325 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:16:08 crc kubenswrapper[4912]: I0318 13:16:08.003354 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:16:08 crc kubenswrapper[4912]: I0318 13:16:08.055181 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn8pb"] Mar 18 13:16:08 crc kubenswrapper[4912]: I0318 13:16:08.238204 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7074714-b00d-490c-83de-bceb48442c19" path="/var/lib/kubelet/pods/a7074714-b00d-490c-83de-bceb48442c19/volumes" Mar 18 13:16:09 crc kubenswrapper[4912]: I0318 13:16:09.972302 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pn8pb" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="registry-server" containerID="cri-o://b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6" gracePeriod=2 Mar 18 13:16:11 
crc kubenswrapper[4912]: I0318 13:16:11.655922 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.722772 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-utilities\") pod \"31fd4b06-fc40-4755-9bdd-de49df0960ef\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.722859 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-catalog-content\") pod \"31fd4b06-fc40-4755-9bdd-de49df0960ef\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.722967 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf2s8\" (UniqueName: \"kubernetes.io/projected/31fd4b06-fc40-4755-9bdd-de49df0960ef-kube-api-access-kf2s8\") pod \"31fd4b06-fc40-4755-9bdd-de49df0960ef\" (UID: \"31fd4b06-fc40-4755-9bdd-de49df0960ef\") " Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.723909 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-utilities" (OuterVolumeSpecName: "utilities") pod "31fd4b06-fc40-4755-9bdd-de49df0960ef" (UID: "31fd4b06-fc40-4755-9bdd-de49df0960ef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.733615 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fd4b06-fc40-4755-9bdd-de49df0960ef-kube-api-access-kf2s8" (OuterVolumeSpecName: "kube-api-access-kf2s8") pod "31fd4b06-fc40-4755-9bdd-de49df0960ef" (UID: "31fd4b06-fc40-4755-9bdd-de49df0960ef"). InnerVolumeSpecName "kube-api-access-kf2s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.810673 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fd4b06-fc40-4755-9bdd-de49df0960ef" (UID: "31fd4b06-fc40-4755-9bdd-de49df0960ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.821213 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9"] Mar 18 13:16:11 crc kubenswrapper[4912]: E0318 13:16:11.821647 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203f8736-4c2f-46af-8bfb-191c0d12f031" containerName="oc" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.821668 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="203f8736-4c2f-46af-8bfb-191c0d12f031" containerName="oc" Mar 18 13:16:11 crc kubenswrapper[4912]: E0318 13:16:11.821683 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="registry-server" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.821691 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="registry-server" Mar 18 13:16:11 crc kubenswrapper[4912]: E0318 
13:16:11.821705 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="extract-utilities" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.821712 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="extract-utilities" Mar 18 13:16:11 crc kubenswrapper[4912]: E0318 13:16:11.821725 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="extract-content" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.821731 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="extract-content" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.821866 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerName="registry-server" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.821879 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="203f8736-4c2f-46af-8bfb-191c0d12f031" containerName="oc" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.822831 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.824595 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf2s8\" (UniqueName: \"kubernetes.io/projected/31fd4b06-fc40-4755-9bdd-de49df0960ef-kube-api-access-kf2s8\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.824637 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.824652 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fd4b06-fc40-4755-9bdd-de49df0960ef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.826580 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.844544 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9"] Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.925857 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.925918 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6l4\" (UniqueName: 
\"kubernetes.io/projected/795cc23a-a174-4e21-8a01-6f631f937583-kube-api-access-6s6l4\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.925974 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.980153 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969"] Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.981525 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.992328 4912 generic.go:334] "Generic (PLEG): container finished" podID="31fd4b06-fc40-4755-9bdd-de49df0960ef" containerID="b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6" exitCode=0 Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.992377 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn8pb" event={"ID":"31fd4b06-fc40-4755-9bdd-de49df0960ef","Type":"ContainerDied","Data":"b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6"} Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.992396 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn8pb" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.992426 4912 scope.go:117] "RemoveContainer" containerID="b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6" Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.992411 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn8pb" event={"ID":"31fd4b06-fc40-4755-9bdd-de49df0960ef","Type":"ContainerDied","Data":"5bafa1563a9e76b546bc469f00cc989c8e653d75378ba1620d99be3e0570d8c7"} Mar 18 13:16:11 crc kubenswrapper[4912]: I0318 13:16:11.994096 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969"] Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.027754 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.027834 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.027858 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-util\") pod 
\"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.027882 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.027915 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6l4\" (UniqueName: \"kubernetes.io/projected/795cc23a-a174-4e21-8a01-6f631f937583-kube-api-access-6s6l4\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.027966 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwkh8\" (UniqueName: \"kubernetes.io/projected/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-kube-api-access-vwkh8\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.028468 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: 
\"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.028520 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.043123 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn8pb"] Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.053748 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pn8pb"] Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.055662 4912 scope.go:117] "RemoveContainer" containerID="fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.074774 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6l4\" (UniqueName: \"kubernetes.io/projected/795cc23a-a174-4e21-8a01-6f631f937583-kube-api-access-6s6l4\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.084777 4912 scope.go:117] "RemoveContainer" containerID="983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.105448 4912 scope.go:117] "RemoveContainer" containerID="b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6" Mar 18 13:16:12 crc kubenswrapper[4912]: E0318 
13:16:12.106120 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6\": container with ID starting with b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6 not found: ID does not exist" containerID="b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.106191 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6"} err="failed to get container status \"b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6\": rpc error: code = NotFound desc = could not find container \"b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6\": container with ID starting with b37e8374defc2ad45ccf183460a4d6058bc9142b7ae9a00e505a70ac067f40e6 not found: ID does not exist" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.106240 4912 scope.go:117] "RemoveContainer" containerID="fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5" Mar 18 13:16:12 crc kubenswrapper[4912]: E0318 13:16:12.108398 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5\": container with ID starting with fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5 not found: ID does not exist" containerID="fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.108447 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5"} err="failed to get container status \"fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5\": rpc 
error: code = NotFound desc = could not find container \"fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5\": container with ID starting with fbf2bd2272581b7e0c4c894412750e86258833532685a6f94aff6eb62ac1e2e5 not found: ID does not exist" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.108475 4912 scope.go:117] "RemoveContainer" containerID="983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a" Mar 18 13:16:12 crc kubenswrapper[4912]: E0318 13:16:12.108830 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a\": container with ID starting with 983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a not found: ID does not exist" containerID="983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.108876 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a"} err="failed to get container status \"983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a\": rpc error: code = NotFound desc = could not find container \"983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a\": container with ID starting with 983644ba76c7e26f13a0a5f4fba5341d874c817dda351599319008c12d969e9a not found: ID does not exist" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.129752 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 
13:16:12.129818 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.129904 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwkh8\" (UniqueName: \"kubernetes.io/projected/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-kube-api-access-vwkh8\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.130382 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.130475 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.141364 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.148086 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwkh8\" (UniqueName: \"kubernetes.io/projected/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-kube-api-access-vwkh8\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.242487 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fd4b06-fc40-4755-9bdd-de49df0960ef" path="/var/lib/kubelet/pods/31fd4b06-fc40-4755-9bdd-de49df0960ef/volumes" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.301574 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.389350 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9"] Mar 18 13:16:12 crc kubenswrapper[4912]: W0318 13:16:12.417158 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod795cc23a_a174_4e21_8a01_6f631f937583.slice/crio-738e11138d30f7c7f8d048f3ebe9ca784a729e2e7cf51844e910adb064392b82 WatchSource:0}: Error finding container 738e11138d30f7c7f8d048f3ebe9ca784a729e2e7cf51844e910adb064392b82: Status 404 returned error can't find the container with id 738e11138d30f7c7f8d048f3ebe9ca784a729e2e7cf51844e910adb064392b82 Mar 18 13:16:12 crc kubenswrapper[4912]: I0318 13:16:12.560602 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969"] Mar 18 13:16:13 crc kubenswrapper[4912]: I0318 13:16:13.008169 4912 generic.go:334] "Generic (PLEG): container finished" podID="795cc23a-a174-4e21-8a01-6f631f937583" containerID="4f32558bfedf3b2fa0b69a990b63a63e8b50017e46f1ea209c284977ca2cc932" exitCode=0 Mar 18 13:16:13 crc kubenswrapper[4912]: I0318 13:16:13.008317 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" event={"ID":"795cc23a-a174-4e21-8a01-6f631f937583","Type":"ContainerDied","Data":"4f32558bfedf3b2fa0b69a990b63a63e8b50017e46f1ea209c284977ca2cc932"} Mar 18 13:16:13 crc kubenswrapper[4912]: I0318 13:16:13.010284 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" event={"ID":"795cc23a-a174-4e21-8a01-6f631f937583","Type":"ContainerStarted","Data":"738e11138d30f7c7f8d048f3ebe9ca784a729e2e7cf51844e910adb064392b82"} Mar 18 13:16:13 crc kubenswrapper[4912]: I0318 13:16:13.017892 4912 generic.go:334] "Generic (PLEG): container finished" podID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerID="8926eeba151a7dc1068d3f888dc61f88ebff6e3a362f2e1ac7f93099e4ffbad1" exitCode=0 Mar 18 13:16:13 crc kubenswrapper[4912]: I0318 13:16:13.017978 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" event={"ID":"f5b8d99c-4d5f-4c25-a12b-3d45756fced8","Type":"ContainerDied","Data":"8926eeba151a7dc1068d3f888dc61f88ebff6e3a362f2e1ac7f93099e4ffbad1"} Mar 18 13:16:13 crc kubenswrapper[4912]: I0318 13:16:13.018082 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" 
event={"ID":"f5b8d99c-4d5f-4c25-a12b-3d45756fced8","Type":"ContainerStarted","Data":"4b3ca298036a6eb83d70604fe7223549c6434da35bb4edf0452eb25e2e1e168e"} Mar 18 13:16:15 crc kubenswrapper[4912]: I0318 13:16:15.034680 4912 generic.go:334] "Generic (PLEG): container finished" podID="795cc23a-a174-4e21-8a01-6f631f937583" containerID="3897e17c0e53ea32fb88ef90477226e6ae19f34f87e34bbf7990b5f8c6d656d8" exitCode=0 Mar 18 13:16:15 crc kubenswrapper[4912]: I0318 13:16:15.034762 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" event={"ID":"795cc23a-a174-4e21-8a01-6f631f937583","Type":"ContainerDied","Data":"3897e17c0e53ea32fb88ef90477226e6ae19f34f87e34bbf7990b5f8c6d656d8"} Mar 18 13:16:15 crc kubenswrapper[4912]: I0318 13:16:15.038122 4912 generic.go:334] "Generic (PLEG): container finished" podID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerID="e8c748bcc14b611e4a4ca092288f3b8943b568307deef484a83b1575350b4fa0" exitCode=0 Mar 18 13:16:15 crc kubenswrapper[4912]: I0318 13:16:15.038177 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" event={"ID":"f5b8d99c-4d5f-4c25-a12b-3d45756fced8","Type":"ContainerDied","Data":"e8c748bcc14b611e4a4ca092288f3b8943b568307deef484a83b1575350b4fa0"} Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.050590 4912 generic.go:334] "Generic (PLEG): container finished" podID="795cc23a-a174-4e21-8a01-6f631f937583" containerID="a849b484c420588641eed34d4a119fad75e1e8cbed7739c4897ac97fd632a217" exitCode=0 Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.050735 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" event={"ID":"795cc23a-a174-4e21-8a01-6f631f937583","Type":"ContainerDied","Data":"a849b484c420588641eed34d4a119fad75e1e8cbed7739c4897ac97fd632a217"} 
Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.053841 4912 generic.go:334] "Generic (PLEG): container finished" podID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerID="6e785b96cb4c28cd2d6f2ac2ff730931670de918719372fab79581153bf2b880" exitCode=0 Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.053896 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" event={"ID":"f5b8d99c-4d5f-4c25-a12b-3d45756fced8","Type":"ContainerDied","Data":"6e785b96cb4c28cd2d6f2ac2ff730931670de918719372fab79581153bf2b880"} Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.730987 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vj869"] Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.734338 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.747305 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vj869"] Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.817663 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-utilities\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.817743 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-catalog-content\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 
13:16:16.817927 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftxr\" (UniqueName: \"kubernetes.io/projected/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-kube-api-access-gftxr\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.920980 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-catalog-content\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.921122 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gftxr\" (UniqueName: \"kubernetes.io/projected/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-kube-api-access-gftxr\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.921271 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-utilities\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.921834 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-catalog-content\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.921923 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-utilities\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:16 crc kubenswrapper[4912]: I0318 13:16:16.948083 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftxr\" (UniqueName: \"kubernetes.io/projected/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-kube-api-access-gftxr\") pod \"redhat-operators-vj869\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.061468 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.396294 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.436203 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-bundle\") pod \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.436394 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-util\") pod \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.436461 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwkh8\" (UniqueName: \"kubernetes.io/projected/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-kube-api-access-vwkh8\") pod \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\" (UID: \"f5b8d99c-4d5f-4c25-a12b-3d45756fced8\") " Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.439327 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-bundle" (OuterVolumeSpecName: "bundle") pod "f5b8d99c-4d5f-4c25-a12b-3d45756fced8" (UID: "f5b8d99c-4d5f-4c25-a12b-3d45756fced8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.448869 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-kube-api-access-vwkh8" (OuterVolumeSpecName: "kube-api-access-vwkh8") pod "f5b8d99c-4d5f-4c25-a12b-3d45756fced8" (UID: "f5b8d99c-4d5f-4c25-a12b-3d45756fced8"). InnerVolumeSpecName "kube-api-access-vwkh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.465236 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-util" (OuterVolumeSpecName: "util") pod "f5b8d99c-4d5f-4c25-a12b-3d45756fced8" (UID: "f5b8d99c-4d5f-4c25-a12b-3d45756fced8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.504823 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vj869"] Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.538106 4912 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-util\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.538145 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwkh8\" (UniqueName: \"kubernetes.io/projected/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-kube-api-access-vwkh8\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.538157 4912 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5b8d99c-4d5f-4c25-a12b-3d45756fced8-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.586768 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.639972 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-bundle\") pod \"795cc23a-a174-4e21-8a01-6f631f937583\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.640064 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-util\") pod \"795cc23a-a174-4e21-8a01-6f631f937583\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.640206 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s6l4\" (UniqueName: \"kubernetes.io/projected/795cc23a-a174-4e21-8a01-6f631f937583-kube-api-access-6s6l4\") pod \"795cc23a-a174-4e21-8a01-6f631f937583\" (UID: \"795cc23a-a174-4e21-8a01-6f631f937583\") " Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.640905 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-bundle" (OuterVolumeSpecName: "bundle") pod "795cc23a-a174-4e21-8a01-6f631f937583" (UID: "795cc23a-a174-4e21-8a01-6f631f937583"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.646423 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795cc23a-a174-4e21-8a01-6f631f937583-kube-api-access-6s6l4" (OuterVolumeSpecName: "kube-api-access-6s6l4") pod "795cc23a-a174-4e21-8a01-6f631f937583" (UID: "795cc23a-a174-4e21-8a01-6f631f937583"). InnerVolumeSpecName "kube-api-access-6s6l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.742223 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s6l4\" (UniqueName: \"kubernetes.io/projected/795cc23a-a174-4e21-8a01-6f631f937583-kube-api-access-6s6l4\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.742264 4912 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.938567 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-util" (OuterVolumeSpecName: "util") pod "795cc23a-a174-4e21-8a01-6f631f937583" (UID: "795cc23a-a174-4e21-8a01-6f631f937583"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:17 crc kubenswrapper[4912]: I0318 13:16:17.945537 4912 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cc23a-a174-4e21-8a01-6f631f937583-util\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.070448 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" event={"ID":"f5b8d99c-4d5f-4c25-a12b-3d45756fced8","Type":"ContainerDied","Data":"4b3ca298036a6eb83d70604fe7223549c6434da35bb4edf0452eb25e2e1e168e"} Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.071128 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b3ca298036a6eb83d70604fe7223549c6434da35bb4edf0452eb25e2e1e168e" Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.070499 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969" Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.071580 4912 generic.go:334] "Generic (PLEG): container finished" podID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerID="972a909d9567a2729683fb110d6d48ebc4297f29b281b4a4a4f64faf3febc3b6" exitCode=0 Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.071655 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj869" event={"ID":"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6","Type":"ContainerDied","Data":"972a909d9567a2729683fb110d6d48ebc4297f29b281b4a4a4f64faf3febc3b6"} Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.071688 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj869" event={"ID":"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6","Type":"ContainerStarted","Data":"8882d7fb0fa68651691b141c9ed440277d0d8672c079b208a7df1bb05575be8f"} Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.075135 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" event={"ID":"795cc23a-a174-4e21-8a01-6f631f937583","Type":"ContainerDied","Data":"738e11138d30f7c7f8d048f3ebe9ca784a729e2e7cf51844e910adb064392b82"} Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.075171 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="738e11138d30f7c7f8d048f3ebe9ca784a729e2e7cf51844e910adb064392b82" Mar 18 13:16:18 crc kubenswrapper[4912]: I0318 13:16:18.075249 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9" Mar 18 13:16:19 crc kubenswrapper[4912]: I0318 13:16:19.085241 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj869" event={"ID":"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6","Type":"ContainerStarted","Data":"084e940d10ecba46378e97a68a1d165055287978787835ccb3b00c22505afc8d"} Mar 18 13:16:21 crc kubenswrapper[4912]: I0318 13:16:21.103205 4912 generic.go:334] "Generic (PLEG): container finished" podID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerID="084e940d10ecba46378e97a68a1d165055287978787835ccb3b00c22505afc8d" exitCode=0 Mar 18 13:16:21 crc kubenswrapper[4912]: I0318 13:16:21.103675 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj869" event={"ID":"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6","Type":"ContainerDied","Data":"084e940d10ecba46378e97a68a1d165055287978787835ccb3b00c22505afc8d"} Mar 18 13:16:22 crc kubenswrapper[4912]: I0318 13:16:22.173349 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj869" event={"ID":"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6","Type":"ContainerStarted","Data":"c586b1ab3e82fd94ed4508e0d2d65e601e8ec4e55121be31ab9df9c23e3ba167"} Mar 18 13:16:22 crc kubenswrapper[4912]: I0318 13:16:22.222217 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vj869" podStartSLOduration=2.589137145 podStartE2EDuration="6.222196083s" podCreationTimestamp="2026-03-18 13:16:16 +0000 UTC" firstStartedPulling="2026-03-18 13:16:18.073809689 +0000 UTC m=+826.533237114" lastFinishedPulling="2026-03-18 13:16:21.706868627 +0000 UTC m=+830.166296052" observedRunningTime="2026-03-18 13:16:22.212300527 +0000 UTC m=+830.671727952" watchObservedRunningTime="2026-03-18 13:16:22.222196083 +0000 UTC m=+830.681623508" Mar 18 13:16:24 crc 
kubenswrapper[4912]: I0318 13:16:24.810324 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct"] Mar 18 13:16:24 crc kubenswrapper[4912]: E0318 13:16:24.812315 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795cc23a-a174-4e21-8a01-6f631f937583" containerName="pull" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.812691 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="795cc23a-a174-4e21-8a01-6f631f937583" containerName="pull" Mar 18 13:16:24 crc kubenswrapper[4912]: E0318 13:16:24.812799 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795cc23a-a174-4e21-8a01-6f631f937583" containerName="util" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.812870 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="795cc23a-a174-4e21-8a01-6f631f937583" containerName="util" Mar 18 13:16:24 crc kubenswrapper[4912]: E0318 13:16:24.813080 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerName="util" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.813162 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerName="util" Mar 18 13:16:24 crc kubenswrapper[4912]: E0318 13:16:24.813245 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795cc23a-a174-4e21-8a01-6f631f937583" containerName="extract" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.813355 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="795cc23a-a174-4e21-8a01-6f631f937583" containerName="extract" Mar 18 13:16:24 crc kubenswrapper[4912]: E0318 13:16:24.813765 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerName="pull" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.813847 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerName="pull" Mar 18 13:16:24 crc kubenswrapper[4912]: E0318 13:16:24.813916 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerName="extract" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.813983 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerName="extract" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.814276 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="795cc23a-a174-4e21-8a01-6f631f937583" containerName="extract" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.814384 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b8d99c-4d5f-4c25-a12b-3d45756fced8" containerName="extract" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.815856 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.818162 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.818524 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.818628 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.818895 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.820065 4912 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.821036 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-l7hcs" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.835405 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct"] Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.863026 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-webhook-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.864685 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.864858 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c5k9\" (UniqueName: \"kubernetes.io/projected/8efdcb68-92df-434c-8446-5be1ef0a94ba-kube-api-access-4c5k9\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.865780 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-apiservice-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.865871 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8efdcb68-92df-434c-8446-5be1ef0a94ba-manager-config\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.967359 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-webhook-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.967790 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.967906 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c5k9\" (UniqueName: 
\"kubernetes.io/projected/8efdcb68-92df-434c-8446-5be1ef0a94ba-kube-api-access-4c5k9\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.967994 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-apiservice-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.968095 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8efdcb68-92df-434c-8446-5be1ef0a94ba-manager-config\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.969209 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8efdcb68-92df-434c-8446-5be1ef0a94ba-manager-config\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.979005 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-webhook-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.988889 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.992629 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8efdcb68-92df-434c-8446-5be1ef0a94ba-apiservice-cert\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:24 crc kubenswrapper[4912]: I0318 13:16:24.998895 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c5k9\" (UniqueName: \"kubernetes.io/projected/8efdcb68-92df-434c-8446-5be1ef0a94ba-kube-api-access-4c5k9\") pod \"loki-operator-controller-manager-867987c6b7-jg2ct\" (UID: \"8efdcb68-92df-434c-8446-5be1ef0a94ba\") " pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:25 crc kubenswrapper[4912]: I0318 13:16:25.137966 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:25 crc kubenswrapper[4912]: I0318 13:16:25.472678 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct"] Mar 18 13:16:26 crc kubenswrapper[4912]: I0318 13:16:26.202180 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" event={"ID":"8efdcb68-92df-434c-8446-5be1ef0a94ba","Type":"ContainerStarted","Data":"f38dea9ef80f3c114a03d82afee17a990e9ea000ae9567286b590ce64b08a4fc"} Mar 18 13:16:27 crc kubenswrapper[4912]: I0318 13:16:27.065189 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:27 crc kubenswrapper[4912]: I0318 13:16:27.065326 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:28 crc kubenswrapper[4912]: I0318 13:16:28.140797 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vj869" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="registry-server" probeResult="failure" output=< Mar 18 13:16:28 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:16:28 crc kubenswrapper[4912]: > Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.112703 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-9lr46"] Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.116287 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.123447 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.124108 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.125408 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-kclmd" Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.143383 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-9lr46"] Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.179539 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27q8\" (UniqueName: \"kubernetes.io/projected/fb5bb2a5-d719-49dd-a9b3-8734f6944648-kube-api-access-t27q8\") pod \"cluster-logging-operator-66689c4bbf-9lr46\" (UID: \"fb5bb2a5-d719-49dd-a9b3-8734f6944648\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.280854 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27q8\" (UniqueName: \"kubernetes.io/projected/fb5bb2a5-d719-49dd-a9b3-8734f6944648-kube-api-access-t27q8\") pod \"cluster-logging-operator-66689c4bbf-9lr46\" (UID: \"fb5bb2a5-d719-49dd-a9b3-8734f6944648\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.328379 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27q8\" (UniqueName: \"kubernetes.io/projected/fb5bb2a5-d719-49dd-a9b3-8734f6944648-kube-api-access-t27q8\") pod 
\"cluster-logging-operator-66689c4bbf-9lr46\" (UID: \"fb5bb2a5-d719-49dd-a9b3-8734f6944648\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" Mar 18 13:16:31 crc kubenswrapper[4912]: I0318 13:16:31.462115 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" Mar 18 13:16:32 crc kubenswrapper[4912]: I0318 13:16:32.332135 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-9lr46"] Mar 18 13:16:32 crc kubenswrapper[4912]: W0318 13:16:32.365489 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb5bb2a5_d719_49dd_a9b3_8734f6944648.slice/crio-dd3e20ff0871b9159a6e6722ad993f08803b6c4df5808d80f09b45a1a3dd272b WatchSource:0}: Error finding container dd3e20ff0871b9159a6e6722ad993f08803b6c4df5808d80f09b45a1a3dd272b: Status 404 returned error can't find the container with id dd3e20ff0871b9159a6e6722ad993f08803b6c4df5808d80f09b45a1a3dd272b Mar 18 13:16:33 crc kubenswrapper[4912]: I0318 13:16:33.275263 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" event={"ID":"8efdcb68-92df-434c-8446-5be1ef0a94ba","Type":"ContainerStarted","Data":"530a49354dcf66a5b129a74c75a194004b2b0eb590109f20f74f88253195b02f"} Mar 18 13:16:33 crc kubenswrapper[4912]: I0318 13:16:33.277501 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" event={"ID":"fb5bb2a5-d719-49dd-a9b3-8734f6944648","Type":"ContainerStarted","Data":"dd3e20ff0871b9159a6e6722ad993f08803b6c4df5808d80f09b45a1a3dd272b"} Mar 18 13:16:33 crc kubenswrapper[4912]: I0318 13:16:33.694408 4912 scope.go:117] "RemoveContainer" containerID="096d815c6e88750ce17f4f80b15927a1dbe11bb161a0ebc48319486a8a784274" Mar 18 13:16:36 crc 
kubenswrapper[4912]: I0318 13:16:36.532556 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7d7mh"] Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.537187 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.550024 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7d7mh"] Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.581365 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-catalog-content\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.581472 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-utilities\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.581531 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqc48\" (UniqueName: \"kubernetes.io/projected/d8e0c42c-0552-4b21-a68d-ccdb0e166896-kube-api-access-dqc48\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.682508 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-utilities\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.682615 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqc48\" (UniqueName: \"kubernetes.io/projected/d8e0c42c-0552-4b21-a68d-ccdb0e166896-kube-api-access-dqc48\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.682695 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-catalog-content\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.683352 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-catalog-content\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.683489 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-utilities\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.702990 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqc48\" (UniqueName: 
\"kubernetes.io/projected/d8e0c42c-0552-4b21-a68d-ccdb0e166896-kube-api-access-dqc48\") pod \"community-operators-7d7mh\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:36 crc kubenswrapper[4912]: I0318 13:16:36.866016 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:37 crc kubenswrapper[4912]: I0318 13:16:37.123387 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:37 crc kubenswrapper[4912]: I0318 13:16:37.192918 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:40 crc kubenswrapper[4912]: I0318 13:16:40.727636 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vj869"] Mar 18 13:16:40 crc kubenswrapper[4912]: I0318 13:16:40.728898 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vj869" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="registry-server" containerID="cri-o://c586b1ab3e82fd94ed4508e0d2d65e601e8ec4e55121be31ab9df9c23e3ba167" gracePeriod=2 Mar 18 13:16:41 crc kubenswrapper[4912]: I0318 13:16:41.349177 4912 generic.go:334] "Generic (PLEG): container finished" podID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerID="c586b1ab3e82fd94ed4508e0d2d65e601e8ec4e55121be31ab9df9c23e3ba167" exitCode=0 Mar 18 13:16:41 crc kubenswrapper[4912]: I0318 13:16:41.349582 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj869" event={"ID":"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6","Type":"ContainerDied","Data":"c586b1ab3e82fd94ed4508e0d2d65e601e8ec4e55121be31ab9df9c23e3ba167"} Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.726437 4912 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.826692 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-catalog-content\") pod \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.826888 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-utilities\") pod \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.826930 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gftxr\" (UniqueName: \"kubernetes.io/projected/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-kube-api-access-gftxr\") pod \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\" (UID: \"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6\") " Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.829300 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-utilities" (OuterVolumeSpecName: "utilities") pod "1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" (UID: "1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.836336 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-kube-api-access-gftxr" (OuterVolumeSpecName: "kube-api-access-gftxr") pod "1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" (UID: "1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6"). 
InnerVolumeSpecName "kube-api-access-gftxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.897217 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7d7mh"] Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.928503 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.928547 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gftxr\" (UniqueName: \"kubernetes.io/projected/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-kube-api-access-gftxr\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:43 crc kubenswrapper[4912]: I0318 13:16:43.975427 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" (UID: "1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.029580 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.380078 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj869" event={"ID":"1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6","Type":"ContainerDied","Data":"8882d7fb0fa68651691b141c9ed440277d0d8672c079b208a7df1bb05575be8f"} Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.380153 4912 scope.go:117] "RemoveContainer" containerID="c586b1ab3e82fd94ed4508e0d2d65e601e8ec4e55121be31ab9df9c23e3ba167" Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.380196 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj869" Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.405520 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vj869"] Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.410920 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vj869"] Mar 18 13:16:44 crc kubenswrapper[4912]: W0318 13:16:44.439743 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e0c42c_0552_4b21_a68d_ccdb0e166896.slice/crio-b1347cf789e89ff2e97b950f95f37b0fa714fda5c4e7145bae17903f1224e20e WatchSource:0}: Error finding container b1347cf789e89ff2e97b950f95f37b0fa714fda5c4e7145bae17903f1224e20e: Status 404 returned error can't find the container with id b1347cf789e89ff2e97b950f95f37b0fa714fda5c4e7145bae17903f1224e20e Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.448411 4912 scope.go:117] "RemoveContainer" 
containerID="084e940d10ecba46378e97a68a1d165055287978787835ccb3b00c22505afc8d" Mar 18 13:16:44 crc kubenswrapper[4912]: I0318 13:16:44.491667 4912 scope.go:117] "RemoveContainer" containerID="972a909d9567a2729683fb110d6d48ebc4297f29b281b4a4a4f64faf3febc3b6" Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.390945 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" event={"ID":"8efdcb68-92df-434c-8446-5be1ef0a94ba","Type":"ContainerStarted","Data":"04f30e4800a0fef31f4a2fea15e9752490063483fc9432183c51807c484a17c9"} Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.391453 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.395027 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.398292 4912 generic.go:334] "Generic (PLEG): container finished" podID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerID="a5afdfa46683c4eec3a485625d2065527428cc7886e6d257e1d056500ca27fa8" exitCode=0 Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.398502 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7d7mh" event={"ID":"d8e0c42c-0552-4b21-a68d-ccdb0e166896","Type":"ContainerDied","Data":"a5afdfa46683c4eec3a485625d2065527428cc7886e6d257e1d056500ca27fa8"} Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.398578 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7d7mh" event={"ID":"d8e0c42c-0552-4b21-a68d-ccdb0e166896","Type":"ContainerStarted","Data":"b1347cf789e89ff2e97b950f95f37b0fa714fda5c4e7145bae17903f1224e20e"} Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 
13:16:45.400728 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" event={"ID":"fb5bb2a5-d719-49dd-a9b3-8734f6944648","Type":"ContainerStarted","Data":"aee822e0ba236ea6567aff17e95e274802a39b2af23b5895504f6917d179db86"} Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.423953 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" podStartSLOduration=2.3688399 podStartE2EDuration="21.423910105s" podCreationTimestamp="2026-03-18 13:16:24 +0000 UTC" firstStartedPulling="2026-03-18 13:16:25.489248889 +0000 UTC m=+833.948676314" lastFinishedPulling="2026-03-18 13:16:44.544319094 +0000 UTC m=+853.003746519" observedRunningTime="2026-03-18 13:16:45.417621354 +0000 UTC m=+853.877048769" watchObservedRunningTime="2026-03-18 13:16:45.423910105 +0000 UTC m=+853.883337570" Mar 18 13:16:45 crc kubenswrapper[4912]: I0318 13:16:45.495550 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9lr46" podStartSLOduration=3.209292481 podStartE2EDuration="14.495529813s" podCreationTimestamp="2026-03-18 13:16:31 +0000 UTC" firstStartedPulling="2026-03-18 13:16:32.370241237 +0000 UTC m=+840.829668662" lastFinishedPulling="2026-03-18 13:16:43.656478569 +0000 UTC m=+852.115905994" observedRunningTime="2026-03-18 13:16:45.485679245 +0000 UTC m=+853.945106690" watchObservedRunningTime="2026-03-18 13:16:45.495529813 +0000 UTC m=+853.954957238" Mar 18 13:16:46 crc kubenswrapper[4912]: I0318 13:16:46.237272 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" path="/var/lib/kubelet/pods/1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6/volumes" Mar 18 13:16:46 crc kubenswrapper[4912]: I0318 13:16:46.410024 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7d7mh" event={"ID":"d8e0c42c-0552-4b21-a68d-ccdb0e166896","Type":"ContainerStarted","Data":"38de1eed43c41f7eee35dc47df268752be15959f6e96630c31216217683e7cf9"} Mar 18 13:16:47 crc kubenswrapper[4912]: I0318 13:16:47.420728 4912 generic.go:334] "Generic (PLEG): container finished" podID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerID="38de1eed43c41f7eee35dc47df268752be15959f6e96630c31216217683e7cf9" exitCode=0 Mar 18 13:16:47 crc kubenswrapper[4912]: I0318 13:16:47.420808 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7d7mh" event={"ID":"d8e0c42c-0552-4b21-a68d-ccdb0e166896","Type":"ContainerDied","Data":"38de1eed43c41f7eee35dc47df268752be15959f6e96630c31216217683e7cf9"} Mar 18 13:16:49 crc kubenswrapper[4912]: I0318 13:16:49.482392 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7d7mh" event={"ID":"d8e0c42c-0552-4b21-a68d-ccdb0e166896","Type":"ContainerStarted","Data":"250b5eaa48a4b71f597c8678d15d010caef39b4b7057db1d10044071b1c22dd5"} Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.360704 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7d7mh" podStartSLOduration=11.624630079 podStartE2EDuration="14.360678566s" podCreationTimestamp="2026-03-18 13:16:36 +0000 UTC" firstStartedPulling="2026-03-18 13:16:45.403319215 +0000 UTC m=+853.862746650" lastFinishedPulling="2026-03-18 13:16:48.139367712 +0000 UTC m=+856.598795137" observedRunningTime="2026-03-18 13:16:49.512951162 +0000 UTC m=+857.972378587" watchObservedRunningTime="2026-03-18 13:16:50.360678566 +0000 UTC m=+858.820105991" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.362024 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 18 13:16:50 crc kubenswrapper[4912]: E0318 13:16:50.362420 4912 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="registry-server" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.362441 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="registry-server" Mar 18 13:16:50 crc kubenswrapper[4912]: E0318 13:16:50.362474 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="extract-content" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.362480 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="extract-content" Mar 18 13:16:50 crc kubenswrapper[4912]: E0318 13:16:50.362488 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="extract-utilities" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.362494 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="extract-utilities" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.362634 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd5614c-4be9-4f1c-acc5-dbe1fbedc0a6" containerName="registry-server" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.363215 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.365098 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.365133 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.367082 4912 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-7l6cl" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.372210 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.533478 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\") pod \"minio\" (UID: \"019e4147-01d3-4c53-9fe6-6172ac760254\") " pod="minio-dev/minio" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.533998 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kplhb\" (UniqueName: \"kubernetes.io/projected/019e4147-01d3-4c53-9fe6-6172ac760254-kube-api-access-kplhb\") pod \"minio\" (UID: \"019e4147-01d3-4c53-9fe6-6172ac760254\") " pod="minio-dev/minio" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.636071 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\") pod \"minio\" (UID: \"019e4147-01d3-4c53-9fe6-6172ac760254\") " pod="minio-dev/minio" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.636206 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kplhb\" (UniqueName: \"kubernetes.io/projected/019e4147-01d3-4c53-9fe6-6172ac760254-kube-api-access-kplhb\") pod \"minio\" (UID: \"019e4147-01d3-4c53-9fe6-6172ac760254\") " pod="minio-dev/minio" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.663596 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.663646 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\") pod \"minio\" (UID: \"019e4147-01d3-4c53-9fe6-6172ac760254\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/247d98cf8c4a27de948431847d5ec8d6c7a9f5c2cb99921875e530794cc177eb/globalmount\"" pod="minio-dev/minio" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.667249 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kplhb\" (UniqueName: \"kubernetes.io/projected/019e4147-01d3-4c53-9fe6-6172ac760254-kube-api-access-kplhb\") pod \"minio\" (UID: \"019e4147-01d3-4c53-9fe6-6172ac760254\") " pod="minio-dev/minio" Mar 18 13:16:50 crc kubenswrapper[4912]: I0318 13:16:50.785871 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-839a8bb5-d08c-41f9-8e81-f2d46eb5656b\") pod \"minio\" (UID: \"019e4147-01d3-4c53-9fe6-6172ac760254\") " pod="minio-dev/minio" Mar 18 13:16:51 crc kubenswrapper[4912]: I0318 13:16:51.027352 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 18 13:16:51 crc kubenswrapper[4912]: I0318 13:16:51.498297 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.526357 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-frlfc"] Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.530414 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"019e4147-01d3-4c53-9fe6-6172ac760254","Type":"ContainerStarted","Data":"107f6c5fa52a42e0faec2397fc2b760032272257b811b4b427bcdb6591fffe18"} Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.530622 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.539028 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frlfc"] Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.612354 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-catalog-content\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.612456 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-utilities\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.612499 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gkccx\" (UniqueName: \"kubernetes.io/projected/44246303-6676-4025-88dc-52ad0ec0ba5e-kube-api-access-gkccx\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.713654 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-utilities\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.713740 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkccx\" (UniqueName: \"kubernetes.io/projected/44246303-6676-4025-88dc-52ad0ec0ba5e-kube-api-access-gkccx\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.713802 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-catalog-content\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.714356 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-catalog-content\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.714871 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-utilities\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.740440 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkccx\" (UniqueName: \"kubernetes.io/projected/44246303-6676-4025-88dc-52ad0ec0ba5e-kube-api-access-gkccx\") pod \"redhat-marketplace-frlfc\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:52 crc kubenswrapper[4912]: I0318 13:16:52.914016 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:16:53 crc kubenswrapper[4912]: I0318 13:16:53.823814 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frlfc"] Mar 18 13:16:54 crc kubenswrapper[4912]: I0318 13:16:54.550520 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerStarted","Data":"8ffc0fbfeea0d34fa966f891b21ebbfc7421a158cff820fc5adca74850e4389c"} Mar 18 13:16:55 crc kubenswrapper[4912]: I0318 13:16:55.564185 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerStarted","Data":"e6be9d8b9f835315aa23bddc52d101513e0dea27de66147766177e18ecbcaa92"} Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.573438 4912 generic.go:334] "Generic (PLEG): container finished" podID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerID="e6be9d8b9f835315aa23bddc52d101513e0dea27de66147766177e18ecbcaa92" exitCode=0 Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.573544 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerDied","Data":"e6be9d8b9f835315aa23bddc52d101513e0dea27de66147766177e18ecbcaa92"} Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.573948 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerStarted","Data":"eab2078d393e526b5787939b89608cbd9f430ca30448fb16455f9dbb5889a46c"} Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.575590 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"019e4147-01d3-4c53-9fe6-6172ac760254","Type":"ContainerStarted","Data":"2c034820be403ab863038fd1b3f893d28efe3b6363ec0feeb23317514fc2c8fc"} Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.615195 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.410538914 podStartE2EDuration="8.615170175s" podCreationTimestamp="2026-03-18 13:16:48 +0000 UTC" firstStartedPulling="2026-03-18 13:16:51.512921263 +0000 UTC m=+859.972348688" lastFinishedPulling="2026-03-18 13:16:55.717552524 +0000 UTC m=+864.176979949" observedRunningTime="2026-03-18 13:16:56.609280745 +0000 UTC m=+865.068708170" watchObservedRunningTime="2026-03-18 13:16:56.615170175 +0000 UTC m=+865.074597600" Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.866549 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.867024 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:56 crc kubenswrapper[4912]: I0318 13:16:56.912261 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 
13:16:57 crc kubenswrapper[4912]: I0318 13:16:57.585909 4912 generic.go:334] "Generic (PLEG): container finished" podID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerID="eab2078d393e526b5787939b89608cbd9f430ca30448fb16455f9dbb5889a46c" exitCode=0 Mar 18 13:16:57 crc kubenswrapper[4912]: I0318 13:16:57.586014 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerDied","Data":"eab2078d393e526b5787939b89608cbd9f430ca30448fb16455f9dbb5889a46c"} Mar 18 13:16:57 crc kubenswrapper[4912]: I0318 13:16:57.645499 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:16:58 crc kubenswrapper[4912]: I0318 13:16:58.595104 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerStarted","Data":"62b018b91e5cdb81f6ae4e5674863e57af43be620fe125f7316c716be4c66197"} Mar 18 13:16:58 crc kubenswrapper[4912]: I0318 13:16:58.627239 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-frlfc" podStartSLOduration=4.112921501 podStartE2EDuration="6.627214275s" podCreationTimestamp="2026-03-18 13:16:52 +0000 UTC" firstStartedPulling="2026-03-18 13:16:55.630687801 +0000 UTC m=+864.090115226" lastFinishedPulling="2026-03-18 13:16:58.144980575 +0000 UTC m=+866.604408000" observedRunningTime="2026-03-18 13:16:58.624868451 +0000 UTC m=+867.084295876" watchObservedRunningTime="2026-03-18 13:16:58.627214275 +0000 UTC m=+867.086641700" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.565142 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv"] Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.567081 4912 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.570482 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.570776 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.572864 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.577729 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.577755 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-x87t9" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.625779 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv"] Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.686455 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd78a5ca-41b5-48af-a603-0ac01cbde069-config\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.686546 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: 
\"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.686601 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.686638 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.686701 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5lcm\" (UniqueName: \"kubernetes.io/projected/cd78a5ca-41b5-48af-a603-0ac01cbde069-kube-api-access-x5lcm\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.715577 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7d7mh"] Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.715918 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7d7mh" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="registry-server" 
containerID="cri-o://250b5eaa48a4b71f597c8678d15d010caef39b4b7057db1d10044071b1c22dd5" gracePeriod=2 Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.788493 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5lcm\" (UniqueName: \"kubernetes.io/projected/cd78a5ca-41b5-48af-a603-0ac01cbde069-kube-api-access-x5lcm\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.788589 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd78a5ca-41b5-48af-a603-0ac01cbde069-config\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.788644 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.788711 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.789721 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.789935 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd78a5ca-41b5-48af-a603-0ac01cbde069-config\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.790459 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.795517 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh"] Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.796510 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.807366 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.807466 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.807533 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.807704 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.813657 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/cd78a5ca-41b5-48af-a603-0ac01cbde069-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.826498 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5lcm\" (UniqueName: \"kubernetes.io/projected/cd78a5ca-41b5-48af-a603-0ac01cbde069-kube-api-access-x5lcm\") pod \"logging-loki-distributor-9c6b6d984-s2ztv\" (UID: \"cd78a5ca-41b5-48af-a603-0ac01cbde069\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.875950 4912 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh"] Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.893627 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8cmx\" (UniqueName: \"kubernetes.io/projected/d49e1c94-aaf7-4502-ad56-46296a08cf03-kube-api-access-r8cmx\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.893713 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.893748 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d49e1c94-aaf7-4502-ad56-46296a08cf03-config\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.893765 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.893792 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.893827 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.929564 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4"] Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.930903 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.938577 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.938754 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.944781 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:01 crc kubenswrapper[4912]: I0318 13:17:01.951543 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.011401 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8cmx\" (UniqueName: \"kubernetes.io/projected/d49e1c94-aaf7-4502-ad56-46296a08cf03-kube-api-access-r8cmx\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.011938 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f374dc-f34c-48df-a280-3434f082b6d0-config\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012009 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012123 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012167 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffhn\" (UniqueName: \"kubernetes.io/projected/95f374dc-f34c-48df-a280-3434f082b6d0-kube-api-access-mffhn\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012249 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d49e1c94-aaf7-4502-ad56-46296a08cf03-config\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012279 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012331 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012423 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: 
\"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012471 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.012607 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.021971 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.035183 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d49e1c94-aaf7-4502-ad56-46296a08cf03-config\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.043928 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.044313 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8cmx\" (UniqueName: \"kubernetes.io/projected/d49e1c94-aaf7-4502-ad56-46296a08cf03-kube-api-access-r8cmx\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.045018 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.050247 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/d49e1c94-aaf7-4502-ad56-46296a08cf03-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-jgbgh\" (UID: \"d49e1c94-aaf7-4502-ad56-46296a08cf03\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.082582 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.084456 4912 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.091980 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.092856 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.093819 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-p5848" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.094958 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.095397 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.095861 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.095882 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.096176 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.098579 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.110822 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.114225 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.114303 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f374dc-f34c-48df-a280-3434f082b6d0-config\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.114353 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 
13:17:02.114402 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mffhn\" (UniqueName: \"kubernetes.io/projected/95f374dc-f34c-48df-a280-3434f082b6d0-kube-api-access-mffhn\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.114477 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.118011 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.118818 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.119586 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f374dc-f34c-48df-a280-3434f082b6d0-config\") pod 
\"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.129374 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/95f374dc-f34c-48df-a280-3434f082b6d0-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.148649 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mffhn\" (UniqueName: \"kubernetes.io/projected/95f374dc-f34c-48df-a280-3434f082b6d0-kube-api-access-mffhn\") pod \"logging-loki-query-frontend-ff66c4dc9-tcbf4\" (UID: \"95f374dc-f34c-48df-a280-3434f082b6d0\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.179492 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.215843 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-tls-secret\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.215891 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216034 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216185 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-rbac\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216221 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dlwf4\" (UniqueName: \"kubernetes.io/projected/22169096-dc0c-47ea-a40e-728cac38c1d4-kube-api-access-dlwf4\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216252 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-tenants\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216271 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-tls-secret\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216627 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216682 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-lokistack-gateway\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " 
pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216729 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216900 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.216973 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-tenants\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.217006 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-rbac\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.217135 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-lokistack-gateway\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.217178 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8rfn\" (UniqueName: \"kubernetes.io/projected/86abc7c8-2019-4f25-84a0-f764bc3f10d6-kube-api-access-r8rfn\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.217205 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.266020 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.323167 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8rfn\" (UniqueName: \"kubernetes.io/projected/86abc7c8-2019-4f25-84a0-f764bc3f10d6-kube-api-access-r8rfn\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.323238 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.328791 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-tls-secret\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.328869 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.328897 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.328976 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-rbac\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329016 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwf4\" (UniqueName: \"kubernetes.io/projected/22169096-dc0c-47ea-a40e-728cac38c1d4-kube-api-access-dlwf4\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329065 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-tenants\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329087 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-tls-secret\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329218 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329272 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-lokistack-gateway\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329326 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329417 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329448 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-tenants\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " 
pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329478 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-rbac\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.329573 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-lokistack-gateway\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.330613 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-rbac\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.330896 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-lokistack-gateway\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.331338 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" 
(UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.332359 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.332392 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-logging-loki-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.332501 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.333495 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.333875 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-rbac\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.337555 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-tenants\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.339198 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-tls-secret\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.342553 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-tenants\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.343821 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/86abc7c8-2019-4f25-84a0-f764bc3f10d6-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.344834 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" 
(UniqueName: \"kubernetes.io/configmap/22169096-dc0c-47ea-a40e-728cac38c1d4-lokistack-gateway\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.345498 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/22169096-dc0c-47ea-a40e-728cac38c1d4-tls-secret\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.354374 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8rfn\" (UniqueName: \"kubernetes.io/projected/86abc7c8-2019-4f25-84a0-f764bc3f10d6-kube-api-access-r8rfn\") pod \"logging-loki-gateway-b5bdf65c4-vqfpz\" (UID: \"86abc7c8-2019-4f25-84a0-f764bc3f10d6\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.367156 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwf4\" (UniqueName: \"kubernetes.io/projected/22169096-dc0c-47ea-a40e-728cac38c1d4-kube-api-access-dlwf4\") pod \"logging-loki-gateway-b5bdf65c4-ldbjt\" (UID: \"22169096-dc0c-47ea-a40e-728cac38c1d4\") " pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.431067 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.452164 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.557424 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.581717 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.652977 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" event={"ID":"cd78a5ca-41b5-48af-a603-0ac01cbde069","Type":"ContainerStarted","Data":"c38a9b4b5ce0a32fc25a770adf1c549edc2406b8ad889b9cde11bdf56f983798"} Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.835286 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.840916 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.849219 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.854414 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.854543 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.916541 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.918442 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.957668 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.958500 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.958577 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.958623 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxgkt\" (UniqueName: \"kubernetes.io/projected/9951708d-b5a5-4dea-9cb5-a89c96f2a404-kube-api-access-zxgkt\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.958680 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.958767 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.958836 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 
13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.958919 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9951708d-b5a5-4dea-9cb5-a89c96f2a404-config\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.976115 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.980100 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.985977 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 18 13:17:02 crc kubenswrapper[4912]: I0318 13:17:02.986267 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.014861 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4"] Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.022091 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.031937 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.044753 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt"] Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.051393 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 18 
13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.052730 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.055228 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.055425 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063184 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063289 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063332 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063407 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063442 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063472 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063519 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063551 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78d0ba71-aecb-4e22-a459-c5f690268e0e-config\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063590 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063614 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxgkt\" (UniqueName: \"kubernetes.io/projected/9951708d-b5a5-4dea-9cb5-a89c96f2a404-kube-api-access-zxgkt\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063635 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063667 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063705 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9951708d-b5a5-4dea-9cb5-a89c96f2a404-config\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063795 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvbcm\" (UniqueName: \"kubernetes.io/projected/78d0ba71-aecb-4e22-a459-c5f690268e0e-kube-api-access-hvbcm\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.063828 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.066411 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9951708d-b5a5-4dea-9cb5-a89c96f2a404-config\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.067330 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.076196 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.076224 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ingester-grpc\") pod 
\"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.079163 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.079245 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb280ea0dbd1a72303153671da4930499d49dac6d31e30fc10d2c0ab2925b4fd/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.080071 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.080125 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1c83f139cb3e007e7a8db313b6c465ff15a18881d95b3ae5451c83c202bbe953/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.083312 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.088115 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/9951708d-b5a5-4dea-9cb5-a89c96f2a404-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.092033 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz"] Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.094888 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxgkt\" (UniqueName: \"kubernetes.io/projected/9951708d-b5a5-4dea-9cb5-a89c96f2a404-kube-api-access-zxgkt\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 
13:17:03.112236 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78ecbdff-99dc-4c8f-a68f-53f1565a170f\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.117155 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a753d979-56ef-4bd1-83cd-07087006d1f2\") pod \"logging-loki-ingester-0\" (UID: \"9951708d-b5a5-4dea-9cb5-a89c96f2a404\") " pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165326 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165394 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4fd206-4176-47ef-9cee-8be6e9ed396f-config\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165424 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " 
pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165457 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78d0ba71-aecb-4e22-a459-c5f690268e0e-config\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165482 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkjnm\" (UniqueName: \"kubernetes.io/projected/5c4fd206-4176-47ef-9cee-8be6e9ed396f-kube-api-access-rkjnm\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165515 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165540 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165583 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-s3\") pod 
\"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165615 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165640 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvbcm\" (UniqueName: \"kubernetes.io/projected/78d0ba71-aecb-4e22-a459-c5f690268e0e-kube-api-access-hvbcm\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165662 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165684 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165712 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.165755 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.167838 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78d0ba71-aecb-4e22-a459-c5f690268e0e-config\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.171285 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.171960 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.175626 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.175826 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bcfb52fa4b78425aff5528620f095e20bb273189650823e40f107dc5f68f2724/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.185875 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.187380 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/78d0ba71-aecb-4e22-a459-c5f690268e0e-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.188271 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvbcm\" (UniqueName: \"kubernetes.io/projected/78d0ba71-aecb-4e22-a459-c5f690268e0e-kube-api-access-hvbcm\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.195458 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.206763 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fead652-bd08-4c9d-980a-e6862a1633b2\") pod \"logging-loki-compactor-0\" (UID: \"78d0ba71-aecb-4e22-a459-c5f690268e0e\") " pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.267003 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.272466 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.273253 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4fd206-4176-47ef-9cee-8be6e9ed396f-config\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.273355 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.273262 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0f529809f0036f9571914015faee2420d1864c8905a924fe6de86a8b214ec055/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.273641 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkjnm\" (UniqueName: \"kubernetes.io/projected/5c4fd206-4176-47ef-9cee-8be6e9ed396f-kube-api-access-rkjnm\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.273749 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.273820 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.273907 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.274654 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4fd206-4176-47ef-9cee-8be6e9ed396f-config\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.275030 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.279000 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.279558 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.282153 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/5c4fd206-4176-47ef-9cee-8be6e9ed396f-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.291654 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkjnm\" (UniqueName: \"kubernetes.io/projected/5c4fd206-4176-47ef-9cee-8be6e9ed396f-kube-api-access-rkjnm\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.329144 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.331620 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9dadbc8-d285-4ae7-be53-c5734be30a98\") pod \"logging-loki-index-gateway-0\" (UID: \"5c4fd206-4176-47ef-9cee-8be6e9ed396f\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.396271 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.502788 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.694645 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" event={"ID":"86abc7c8-2019-4f25-84a0-f764bc3f10d6","Type":"ContainerStarted","Data":"cd9b2c8af11996cbfd6e693f918656a61ffbb4953f249a1717bd42755a1e7f3a"} Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.726609 4912 generic.go:334] "Generic (PLEG): container finished" podID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerID="250b5eaa48a4b71f597c8678d15d010caef39b4b7057db1d10044071b1c22dd5" exitCode=0 Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.726761 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7d7mh" event={"ID":"d8e0c42c-0552-4b21-a68d-ccdb0e166896","Type":"ContainerDied","Data":"250b5eaa48a4b71f597c8678d15d010caef39b4b7057db1d10044071b1c22dd5"} Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.728236 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" 
event={"ID":"d49e1c94-aaf7-4502-ad56-46296a08cf03","Type":"ContainerStarted","Data":"fab515fd903c3c3b8356c78ae357440760b1e4b38f20e5fdcb8d02b95702a300"} Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.729498 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" event={"ID":"22169096-dc0c-47ea-a40e-728cac38c1d4","Type":"ContainerStarted","Data":"6c05c14bdaa708c6e6311263b3400e1a525985b9b66cc951790b69adfc6b3697"} Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.730472 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" event={"ID":"95f374dc-f34c-48df-a280-3434f082b6d0","Type":"ContainerStarted","Data":"7514618c965f16e857adfc32aff0900091a2c22a08fa95746a14c4c8efb80912"} Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.731577 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"9951708d-b5a5-4dea-9cb5-a89c96f2a404","Type":"ContainerStarted","Data":"894914105aaffd48ef579d75ed5292588efa7df2cd7805336b1c82d1d5cecf99"} Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.829897 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:17:03 crc kubenswrapper[4912]: I0318 13:17:03.961652 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 18 13:17:03 crc kubenswrapper[4912]: W0318 13:17:03.972721 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78d0ba71_aecb_4e22_a459_c5f690268e0e.slice/crio-6843e11568be0af2d3f87c179c24c9cd551ab3532e35f470d63b7e1b31d98c7d WatchSource:0}: Error finding container 6843e11568be0af2d3f87c179c24c9cd551ab3532e35f470d63b7e1b31d98c7d: Status 404 returned error can't find the container with id 
6843e11568be0af2d3f87c179c24c9cd551ab3532e35f470d63b7e1b31d98c7d Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.115376 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.120695 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:17:04 crc kubenswrapper[4912]: W0318 13:17:04.125793 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c4fd206_4176_47ef_9cee_8be6e9ed396f.slice/crio-78ce5575b345982fcd479d39880e55aedc6a6bc12dd16da7179cfdf357ec2441 WatchSource:0}: Error finding container 78ce5575b345982fcd479d39880e55aedc6a6bc12dd16da7179cfdf357ec2441: Status 404 returned error can't find the container with id 78ce5575b345982fcd479d39880e55aedc6a6bc12dd16da7179cfdf357ec2441 Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.197480 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqc48\" (UniqueName: \"kubernetes.io/projected/d8e0c42c-0552-4b21-a68d-ccdb0e166896-kube-api-access-dqc48\") pod \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.197611 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-catalog-content\") pod \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\" (UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.197688 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-utilities\") pod \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\" 
(UID: \"d8e0c42c-0552-4b21-a68d-ccdb0e166896\") " Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.199081 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-utilities" (OuterVolumeSpecName: "utilities") pod "d8e0c42c-0552-4b21-a68d-ccdb0e166896" (UID: "d8e0c42c-0552-4b21-a68d-ccdb0e166896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.203902 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e0c42c-0552-4b21-a68d-ccdb0e166896-kube-api-access-dqc48" (OuterVolumeSpecName: "kube-api-access-dqc48") pod "d8e0c42c-0552-4b21-a68d-ccdb0e166896" (UID: "d8e0c42c-0552-4b21-a68d-ccdb0e166896"). InnerVolumeSpecName "kube-api-access-dqc48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.251728 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8e0c42c-0552-4b21-a68d-ccdb0e166896" (UID: "d8e0c42c-0552-4b21-a68d-ccdb0e166896"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.300439 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.300481 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0c42c-0552-4b21-a68d-ccdb0e166896-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.300509 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqc48\" (UniqueName: \"kubernetes.io/projected/d8e0c42c-0552-4b21-a68d-ccdb0e166896-kube-api-access-dqc48\") on node \"crc\" DevicePath \"\"" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.517084 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frlfc"] Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.756022 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"5c4fd206-4176-47ef-9cee-8be6e9ed396f","Type":"ContainerStarted","Data":"78ce5575b345982fcd479d39880e55aedc6a6bc12dd16da7179cfdf357ec2441"} Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.763388 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7d7mh" event={"ID":"d8e0c42c-0552-4b21-a68d-ccdb0e166896","Type":"ContainerDied","Data":"b1347cf789e89ff2e97b950f95f37b0fa714fda5c4e7145bae17903f1224e20e"} Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.763467 4912 scope.go:117] "RemoveContainer" containerID="250b5eaa48a4b71f597c8678d15d010caef39b4b7057db1d10044071b1c22dd5" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.763481 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7d7mh" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.765737 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"78d0ba71-aecb-4e22-a459-c5f690268e0e","Type":"ContainerStarted","Data":"6843e11568be0af2d3f87c179c24c9cd551ab3532e35f470d63b7e1b31d98c7d"} Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.787142 4912 scope.go:117] "RemoveContainer" containerID="38de1eed43c41f7eee35dc47df268752be15959f6e96630c31216217683e7cf9" Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.799775 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7d7mh"] Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.809817 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7d7mh"] Mar 18 13:17:04 crc kubenswrapper[4912]: I0318 13:17:04.815600 4912 scope.go:117] "RemoveContainer" containerID="a5afdfa46683c4eec3a485625d2065527428cc7886e6d257e1d056500ca27fa8" Mar 18 13:17:05 crc kubenswrapper[4912]: I0318 13:17:05.781118 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-frlfc" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerName="registry-server" containerID="cri-o://62b018b91e5cdb81f6ae4e5674863e57af43be620fe125f7316c716be4c66197" gracePeriod=2 Mar 18 13:17:06 crc kubenswrapper[4912]: I0318 13:17:06.242989 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" path="/var/lib/kubelet/pods/d8e0c42c-0552-4b21-a68d-ccdb0e166896/volumes" Mar 18 13:17:06 crc kubenswrapper[4912]: I0318 13:17:06.793495 4912 generic.go:334] "Generic (PLEG): container finished" podID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerID="62b018b91e5cdb81f6ae4e5674863e57af43be620fe125f7316c716be4c66197" exitCode=0 Mar 18 13:17:06 crc 
kubenswrapper[4912]: I0318 13:17:06.793592 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerDied","Data":"62b018b91e5cdb81f6ae4e5674863e57af43be620fe125f7316c716be4c66197"} Mar 18 13:17:06 crc kubenswrapper[4912]: I0318 13:17:06.998528 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:17:06 crc kubenswrapper[4912]: I0318 13:17:06.998614 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.017915 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.184710 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkccx\" (UniqueName: \"kubernetes.io/projected/44246303-6676-4025-88dc-52ad0ec0ba5e-kube-api-access-gkccx\") pod \"44246303-6676-4025-88dc-52ad0ec0ba5e\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.184971 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-utilities\") pod \"44246303-6676-4025-88dc-52ad0ec0ba5e\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.185130 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-catalog-content\") pod \"44246303-6676-4025-88dc-52ad0ec0ba5e\" (UID: \"44246303-6676-4025-88dc-52ad0ec0ba5e\") " Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.186220 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-utilities" (OuterVolumeSpecName: "utilities") pod "44246303-6676-4025-88dc-52ad0ec0ba5e" (UID: "44246303-6676-4025-88dc-52ad0ec0ba5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.192288 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44246303-6676-4025-88dc-52ad0ec0ba5e-kube-api-access-gkccx" (OuterVolumeSpecName: "kube-api-access-gkccx") pod "44246303-6676-4025-88dc-52ad0ec0ba5e" (UID: "44246303-6676-4025-88dc-52ad0ec0ba5e"). InnerVolumeSpecName "kube-api-access-gkccx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.215900 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44246303-6676-4025-88dc-52ad0ec0ba5e" (UID: "44246303-6676-4025-88dc-52ad0ec0ba5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.286725 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.286769 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkccx\" (UniqueName: \"kubernetes.io/projected/44246303-6676-4025-88dc-52ad0ec0ba5e-kube-api-access-gkccx\") on node \"crc\" DevicePath \"\"" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.286786 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44246303-6676-4025-88dc-52ad0ec0ba5e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.815544 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" event={"ID":"95f374dc-f34c-48df-a280-3434f082b6d0","Type":"ContainerStarted","Data":"b985707d0e8344c65b8233e26d03c700009f15bd9728c3cd3faf95f6e7130ae8"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.815725 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.820545 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frlfc" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.820593 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frlfc" event={"ID":"44246303-6676-4025-88dc-52ad0ec0ba5e","Type":"ContainerDied","Data":"8ffc0fbfeea0d34fa966f891b21ebbfc7421a158cff820fc5adca74850e4389c"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.820768 4912 scope.go:117] "RemoveContainer" containerID="62b018b91e5cdb81f6ae4e5674863e57af43be620fe125f7316c716be4c66197" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.823303 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"9951708d-b5a5-4dea-9cb5-a89c96f2a404","Type":"ContainerStarted","Data":"d133240ac145ed0012a9d472c97fb342ac0d31f07825bc0d9eb8ba27e22d04d5"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.823498 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.825323 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" event={"ID":"86abc7c8-2019-4f25-84a0-f764bc3f10d6","Type":"ContainerStarted","Data":"97da447b50719d3a4eba1809b894c1d14d6a7cc1b9b3417a0e98e71cce90d0be"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.826803 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"78d0ba71-aecb-4e22-a459-c5f690268e0e","Type":"ContainerStarted","Data":"fbaaf91f9021b53712e48c449e98d6f71e65b45f878e30d7957b144e0aa4c227"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.827361 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.830422 4912 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" event={"ID":"22169096-dc0c-47ea-a40e-728cac38c1d4","Type":"ContainerStarted","Data":"ad0a8fac7e6818a4d2013c44341c218ddcb423977f73506943822073e53dc3c6"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.833529 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"5c4fd206-4176-47ef-9cee-8be6e9ed396f","Type":"ContainerStarted","Data":"545dfc943a94adf00a75c042702cd96b4d54b0b53ec8bc9b7ccd96ee9dd065dd"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.833895 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.838182 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" podStartSLOduration=2.675241791 podStartE2EDuration="7.838157384s" podCreationTimestamp="2026-03-18 13:17:01 +0000 UTC" firstStartedPulling="2026-03-18 13:17:02.976955524 +0000 UTC m=+871.436382939" lastFinishedPulling="2026-03-18 13:17:08.139871107 +0000 UTC m=+876.599298532" observedRunningTime="2026-03-18 13:17:08.831351768 +0000 UTC m=+877.290779273" watchObservedRunningTime="2026-03-18 13:17:08.838157384 +0000 UTC m=+877.297584809" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.847496 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" event={"ID":"d49e1c94-aaf7-4502-ad56-46296a08cf03","Type":"ContainerStarted","Data":"cce4bc31adbe4c176cb59089355c8f7feef7324a22ff7672e6f72a5dc5dc81e6"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.847825 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.855449 4912 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.286796268 podStartE2EDuration="7.855429693s" podCreationTimestamp="2026-03-18 13:17:01 +0000 UTC" firstStartedPulling="2026-03-18 13:17:03.568085606 +0000 UTC m=+872.027513031" lastFinishedPulling="2026-03-18 13:17:08.136719031 +0000 UTC m=+876.596146456" observedRunningTime="2026-03-18 13:17:08.854642182 +0000 UTC m=+877.314069637" watchObservedRunningTime="2026-03-18 13:17:08.855429693 +0000 UTC m=+877.314857118" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.861313 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" event={"ID":"cd78a5ca-41b5-48af-a603-0ac01cbde069","Type":"ContainerStarted","Data":"07f04d3066b71c0fd7443910b4cf17afacb563788d99adbb2f7eb3c9f73cdd3e"} Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.861721 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.869313 4912 scope.go:117] "RemoveContainer" containerID="eab2078d393e526b5787939b89608cbd9f430ca30448fb16455f9dbb5889a46c" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.896793 4912 scope.go:117] "RemoveContainer" containerID="e6be9d8b9f835315aa23bddc52d101513e0dea27de66147766177e18ecbcaa92" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.918830 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" podStartSLOduration=2.472409332 podStartE2EDuration="7.918805898s" podCreationTimestamp="2026-03-18 13:17:01 +0000 UTC" firstStartedPulling="2026-03-18 13:17:02.69269238 +0000 UTC m=+871.152119805" lastFinishedPulling="2026-03-18 13:17:08.139088946 +0000 UTC m=+876.598516371" observedRunningTime="2026-03-18 13:17:08.913590846 +0000 UTC 
m=+877.373018281" watchObservedRunningTime="2026-03-18 13:17:08.918805898 +0000 UTC m=+877.378233323" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.919829 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.815387439 podStartE2EDuration="7.919815835s" podCreationTimestamp="2026-03-18 13:17:01 +0000 UTC" firstStartedPulling="2026-03-18 13:17:03.976336963 +0000 UTC m=+872.435764408" lastFinishedPulling="2026-03-18 13:17:08.080765379 +0000 UTC m=+876.540192804" observedRunningTime="2026-03-18 13:17:08.886379086 +0000 UTC m=+877.345806531" watchObservedRunningTime="2026-03-18 13:17:08.919815835 +0000 UTC m=+877.379243270" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.931913 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frlfc"] Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.938549 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-frlfc"] Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.954451 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" podStartSLOduration=2.421664532 podStartE2EDuration="7.954424567s" podCreationTimestamp="2026-03-18 13:17:01 +0000 UTC" firstStartedPulling="2026-03-18 13:17:02.606259549 +0000 UTC m=+871.065686974" lastFinishedPulling="2026-03-18 13:17:08.139019584 +0000 UTC m=+876.598447009" observedRunningTime="2026-03-18 13:17:08.947378665 +0000 UTC m=+877.406806100" watchObservedRunningTime="2026-03-18 13:17:08.954424567 +0000 UTC m=+877.413851992" Mar 18 13:17:08 crc kubenswrapper[4912]: I0318 13:17:08.970241 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.958869623 podStartE2EDuration="7.970214816s" 
podCreationTimestamp="2026-03-18 13:17:01 +0000 UTC" firstStartedPulling="2026-03-18 13:17:04.128889044 +0000 UTC m=+872.588316469" lastFinishedPulling="2026-03-18 13:17:08.140234237 +0000 UTC m=+876.599661662" observedRunningTime="2026-03-18 13:17:08.964147531 +0000 UTC m=+877.423574976" watchObservedRunningTime="2026-03-18 13:17:08.970214816 +0000 UTC m=+877.429642241" Mar 18 13:17:10 crc kubenswrapper[4912]: I0318 13:17:10.241527 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" path="/var/lib/kubelet/pods/44246303-6676-4025-88dc-52ad0ec0ba5e/volumes" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.889988 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" event={"ID":"86abc7c8-2019-4f25-84a0-f764bc3f10d6","Type":"ContainerStarted","Data":"42d2d6b41dbf9cdc64e6092f905b41cd8ecde2f1b893972c984a48660f558d82"} Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.890361 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.890946 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.895462 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" event={"ID":"22169096-dc0c-47ea-a40e-728cac38c1d4","Type":"ContainerStarted","Data":"17269689ff72228ed0529efbdbe41ab598d7817f612883ed2083c6d880991a02"} Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.895861 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.896015 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.905764 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.908259 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.908509 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.915789 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.924102 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podStartSLOduration=2.219614652 podStartE2EDuration="9.9240753s" podCreationTimestamp="2026-03-18 13:17:02 +0000 UTC" firstStartedPulling="2026-03-18 13:17:03.039479555 +0000 UTC m=+871.498906980" lastFinishedPulling="2026-03-18 13:17:10.743940203 +0000 UTC m=+879.203367628" observedRunningTime="2026-03-18 13:17:11.91817962 +0000 UTC m=+880.377607065" watchObservedRunningTime="2026-03-18 13:17:11.9240753 +0000 UTC m=+880.383502725" Mar 18 13:17:11 crc kubenswrapper[4912]: I0318 13:17:11.955019 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podStartSLOduration=2.201968632 podStartE2EDuration="9.954990621s" podCreationTimestamp="2026-03-18 13:17:02 +0000 UTC" firstStartedPulling="2026-03-18 13:17:02.976934453 +0000 UTC m=+871.436361878" lastFinishedPulling="2026-03-18 13:17:10.729956442 +0000 UTC m=+879.189383867" observedRunningTime="2026-03-18 
13:17:11.945923654 +0000 UTC m=+880.405351119" watchObservedRunningTime="2026-03-18 13:17:11.954990621 +0000 UTC m=+880.414418046" Mar 18 13:17:23 crc kubenswrapper[4912]: I0318 13:17:23.200681 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 18 13:17:23 crc kubenswrapper[4912]: I0318 13:17:23.201480 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 13:17:23 crc kubenswrapper[4912]: I0318 13:17:23.339943 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 18 13:17:23 crc kubenswrapper[4912]: I0318 13:17:23.406492 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 13:17:31 crc kubenswrapper[4912]: I0318 13:17:31.954705 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 13:17:32 crc kubenswrapper[4912]: I0318 13:17:32.186395 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 13:17:32 crc kubenswrapper[4912]: I0318 13:17:32.281136 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 13:17:33 crc kubenswrapper[4912]: I0318 13:17:33.204449 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with 
statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 18 13:17:33 crc kubenswrapper[4912]: I0318 13:17:33.205013 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 13:17:36 crc kubenswrapper[4912]: I0318 13:17:36.999385 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:17:37 crc kubenswrapper[4912]: I0318 13:17:37.000027 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:17:43 crc kubenswrapper[4912]: I0318 13:17:43.200530 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 18 13:17:43 crc kubenswrapper[4912]: I0318 13:17:43.201401 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 13:17:53 crc kubenswrapper[4912]: I0318 13:17:53.201431 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure 
output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 18 13:17:53 crc kubenswrapper[4912]: I0318 13:17:53.202265 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.138117 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563998-cjg4w"] Mar 18 13:18:00 crc kubenswrapper[4912]: E0318 13:18:00.139119 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="extract-content" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139131 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="extract-content" Mar 18 13:18:00 crc kubenswrapper[4912]: E0318 13:18:00.139146 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerName="extract-utilities" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139152 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerName="extract-utilities" Mar 18 13:18:00 crc kubenswrapper[4912]: E0318 13:18:00.139162 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="registry-server" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139168 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="registry-server" Mar 18 13:18:00 crc kubenswrapper[4912]: E0318 13:18:00.139180 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" 
containerName="extract-content" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139185 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerName="extract-content" Mar 18 13:18:00 crc kubenswrapper[4912]: E0318 13:18:00.139203 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerName="registry-server" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139211 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerName="registry-server" Mar 18 13:18:00 crc kubenswrapper[4912]: E0318 13:18:00.139225 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="extract-utilities" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139231 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="extract-utilities" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139367 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="44246303-6676-4025-88dc-52ad0ec0ba5e" containerName="registry-server" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139384 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e0c42c-0552-4b21-a68d-ccdb0e166896" containerName="registry-server" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.139959 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-cjg4w" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.143824 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.144929 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.145613 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.149075 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-cjg4w"] Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.294583 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42zj5\" (UniqueName: \"kubernetes.io/projected/9395a26a-9b3b-4280-adc8-5d8d4f193d5f-kube-api-access-42zj5\") pod \"auto-csr-approver-29563998-cjg4w\" (UID: \"9395a26a-9b3b-4280-adc8-5d8d4f193d5f\") " pod="openshift-infra/auto-csr-approver-29563998-cjg4w" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.397050 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42zj5\" (UniqueName: \"kubernetes.io/projected/9395a26a-9b3b-4280-adc8-5d8d4f193d5f-kube-api-access-42zj5\") pod \"auto-csr-approver-29563998-cjg4w\" (UID: \"9395a26a-9b3b-4280-adc8-5d8d4f193d5f\") " pod="openshift-infra/auto-csr-approver-29563998-cjg4w" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.425619 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42zj5\" (UniqueName: \"kubernetes.io/projected/9395a26a-9b3b-4280-adc8-5d8d4f193d5f-kube-api-access-42zj5\") pod \"auto-csr-approver-29563998-cjg4w\" (UID: \"9395a26a-9b3b-4280-adc8-5d8d4f193d5f\") " 
pod="openshift-infra/auto-csr-approver-29563998-cjg4w" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.515809 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-cjg4w" Mar 18 13:18:00 crc kubenswrapper[4912]: I0318 13:18:00.969445 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-cjg4w"] Mar 18 13:18:01 crc kubenswrapper[4912]: I0318 13:18:01.266010 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-cjg4w" event={"ID":"9395a26a-9b3b-4280-adc8-5d8d4f193d5f","Type":"ContainerStarted","Data":"4ffa32272d37df109dc094c27f7e53f8eece32f8c3e706806ae56de3138e8389"} Mar 18 13:18:03 crc kubenswrapper[4912]: I0318 13:18:03.202928 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 18 13:18:03 crc kubenswrapper[4912]: I0318 13:18:03.293333 4912 generic.go:334] "Generic (PLEG): container finished" podID="9395a26a-9b3b-4280-adc8-5d8d4f193d5f" containerID="786a714477551d9d646014e23a3cad3d41f11842b7c0ed76b3216e04a5ddf5d5" exitCode=0 Mar 18 13:18:03 crc kubenswrapper[4912]: I0318 13:18:03.293386 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-cjg4w" event={"ID":"9395a26a-9b3b-4280-adc8-5d8d4f193d5f","Type":"ContainerDied","Data":"786a714477551d9d646014e23a3cad3d41f11842b7c0ed76b3216e04a5ddf5d5"} Mar 18 13:18:04 crc kubenswrapper[4912]: I0318 13:18:04.564840 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-cjg4w" Mar 18 13:18:04 crc kubenswrapper[4912]: I0318 13:18:04.686648 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42zj5\" (UniqueName: \"kubernetes.io/projected/9395a26a-9b3b-4280-adc8-5d8d4f193d5f-kube-api-access-42zj5\") pod \"9395a26a-9b3b-4280-adc8-5d8d4f193d5f\" (UID: \"9395a26a-9b3b-4280-adc8-5d8d4f193d5f\") " Mar 18 13:18:04 crc kubenswrapper[4912]: I0318 13:18:04.696356 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9395a26a-9b3b-4280-adc8-5d8d4f193d5f-kube-api-access-42zj5" (OuterVolumeSpecName: "kube-api-access-42zj5") pod "9395a26a-9b3b-4280-adc8-5d8d4f193d5f" (UID: "9395a26a-9b3b-4280-adc8-5d8d4f193d5f"). InnerVolumeSpecName "kube-api-access-42zj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:18:04 crc kubenswrapper[4912]: I0318 13:18:04.789105 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42zj5\" (UniqueName: \"kubernetes.io/projected/9395a26a-9b3b-4280-adc8-5d8d4f193d5f-kube-api-access-42zj5\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:05 crc kubenswrapper[4912]: I0318 13:18:05.309988 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-cjg4w" event={"ID":"9395a26a-9b3b-4280-adc8-5d8d4f193d5f","Type":"ContainerDied","Data":"4ffa32272d37df109dc094c27f7e53f8eece32f8c3e706806ae56de3138e8389"} Mar 18 13:18:05 crc kubenswrapper[4912]: I0318 13:18:05.310027 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-cjg4w" Mar 18 13:18:05 crc kubenswrapper[4912]: I0318 13:18:05.310069 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ffa32272d37df109dc094c27f7e53f8eece32f8c3e706806ae56de3138e8389" Mar 18 13:18:05 crc kubenswrapper[4912]: I0318 13:18:05.635733 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-b5clf"] Mar 18 13:18:05 crc kubenswrapper[4912]: I0318 13:18:05.641446 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-b5clf"] Mar 18 13:18:06 crc kubenswrapper[4912]: I0318 13:18:06.240454 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8c4d7d-23d6-4ee7-afe7-86295ac42fea" path="/var/lib/kubelet/pods/5f8c4d7d-23d6-4ee7-afe7-86295ac42fea/volumes" Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.000007 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.000149 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.000211 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.001227 4912 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05cdb0519bbf08b4e978264b2bdfdf4662c568a31d460058df90a82a4a831459"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.001304 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://05cdb0519bbf08b4e978264b2bdfdf4662c568a31d460058df90a82a4a831459" gracePeriod=600 Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.331395 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="05cdb0519bbf08b4e978264b2bdfdf4662c568a31d460058df90a82a4a831459" exitCode=0 Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.331472 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"05cdb0519bbf08b4e978264b2bdfdf4662c568a31d460058df90a82a4a831459"} Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.331898 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"d8ccf0d59f4df315e6c70763e698a5db3ca57ea836d4bda55d4c5996a0aad5df"} Mar 18 13:18:07 crc kubenswrapper[4912]: I0318 13:18:07.331926 4912 scope.go:117] "RemoveContainer" containerID="49978836919fd8cfdafc75ececc71c0f4203bbe58b30f6727a2f99ecffd2ea26" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.877407 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-d9cc2"] Mar 18 13:18:22 crc 
kubenswrapper[4912]: E0318 13:18:22.878947 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9395a26a-9b3b-4280-adc8-5d8d4f193d5f" containerName="oc" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.878971 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9395a26a-9b3b-4280-adc8-5d8d4f193d5f" containerName="oc" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.879175 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9395a26a-9b3b-4280-adc8-5d8d4f193d5f" containerName="oc" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.879937 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.888559 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.888694 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-jvrgc" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.889265 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-d9cc2"] Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.888720 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.888766 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.891263 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.897327 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.943990 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/66713b6f-a8f6-4118-b6f4-19012e591e7a-datadir\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944214 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-token\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944273 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944343 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config-openshift-service-cacrt\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944428 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pf7m\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-kube-api-access-6pf7m\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944466 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944508 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66713b6f-a8f6-4118-b6f4-19012e591e7a-tmp\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944528 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944556 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-trusted-ca\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944581 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-sa-token\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:22 crc kubenswrapper[4912]: I0318 13:18:22.944629 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" 
(UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-entrypoint\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.035885 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-d9cc2"] Mar 18 13:18:23 crc kubenswrapper[4912]: E0318 13:18:23.036699 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-6pf7m metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-d9cc2" podUID="66713b6f-a8f6-4118-b6f4-19012e591e7a" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046341 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pf7m\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-kube-api-access-6pf7m\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046399 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046442 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66713b6f-a8f6-4118-b6f4-19012e591e7a-tmp\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 
13:18:23.046463 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046484 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-trusted-ca\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046507 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-sa-token\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046543 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-entrypoint\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046563 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/66713b6f-a8f6-4118-b6f4-19012e591e7a-datadir\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: 
\"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-token\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046622 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.046656 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config-openshift-service-cacrt\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.047161 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/66713b6f-a8f6-4118-b6f4-19012e591e7a-datadir\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: E0318 13:18:23.047265 4912 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Mar 18 13:18:23 crc kubenswrapper[4912]: E0318 13:18:23.047406 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver podName:66713b6f-a8f6-4118-b6f4-19012e591e7a nodeName:}" failed. No retries permitted until 2026-03-18 13:18:23.547370024 +0000 UTC m=+952.006797449 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver") pod "collector-d9cc2" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a") : secret "collector-syslog-receiver" not found Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.047700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config-openshift-service-cacrt\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: E0318 13:18:23.047814 4912 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 18 13:18:23 crc kubenswrapper[4912]: E0318 13:18:23.047868 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics podName:66713b6f-a8f6-4118-b6f4-19012e591e7a nodeName:}" failed. No retries permitted until 2026-03-18 13:18:23.547852997 +0000 UTC m=+952.007280622 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics") pod "collector-d9cc2" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a") : secret "collector-metrics" not found Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.048589 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-entrypoint\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.048794 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.049756 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-trusted-ca\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.056770 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66713b6f-a8f6-4118-b6f4-19012e591e7a-tmp\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2" Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.056871 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-token\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " 
pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.066561 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-sa-token\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.069519 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pf7m\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-kube-api-access-6pf7m\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.478984 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.497176 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.554543 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/66713b6f-a8f6-4118-b6f4-19012e591e7a-datadir\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.554852 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pf7m\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-kube-api-access-6pf7m\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.555009 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-sa-token\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.554664 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66713b6f-a8f6-4118-b6f4-19012e591e7a-datadir" (OuterVolumeSpecName: "datadir") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "datadir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.555305 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-token\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.555718 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.555824 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-trusted-ca\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.556005 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66713b6f-a8f6-4118-b6f4-19012e591e7a-tmp\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.556196 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-entrypoint\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.556293 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config-openshift-service-cacrt\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.556702 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.557070 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.556304 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config" (OuterVolumeSpecName: "config") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.556524 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.556842 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.557020 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.557485 4912 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/66713b6f-a8f6-4118-b6f4-19012e591e7a-datadir\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.558329 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-token" (OuterVolumeSpecName: "collector-token") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "collector-token". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.558348 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-kube-api-access-6pf7m" (OuterVolumeSpecName: "kube-api-access-6pf7m") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "kube-api-access-6pf7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.558608 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-sa-token" (OuterVolumeSpecName: "sa-token") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.559361 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66713b6f-a8f6-4118-b6f4-19012e591e7a-tmp" (OuterVolumeSpecName: "tmp") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.560535 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.561834 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics\") pod \"collector-d9cc2\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") " pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659185 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659287 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver\") pod \"66713b6f-a8f6-4118-b6f4-19012e591e7a\" (UID: \"66713b6f-a8f6-4118-b6f4-19012e591e7a\") "
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659893 4912 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-token\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659921 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config\") on node \"crc\" 
DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659933 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659943 4912 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66713b6f-a8f6-4118-b6f4-19012e591e7a-tmp\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659951 4912 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-entrypoint\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659960 4912 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/66713b6f-a8f6-4118-b6f4-19012e591e7a-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659972 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pf7m\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-kube-api-access-6pf7m\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.659981 4912 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/66713b6f-a8f6-4118-b6f4-19012e591e7a-sa-token\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.662504 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics" (OuterVolumeSpecName: "metrics") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.662669 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "66713b6f-a8f6-4118-b6f4-19012e591e7a" (UID: "66713b6f-a8f6-4118-b6f4-19012e591e7a"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.762529 4912 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-metrics\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:23 crc kubenswrapper[4912]: I0318 13:18:23.762580 4912 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/66713b6f-a8f6-4118-b6f4-19012e591e7a-collector-syslog-receiver\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.486396 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-d9cc2"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.537985 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-d9cc2"]
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.548504 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-d9cc2"]
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.557531 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-zx7v8"]
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.561292 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.564389 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-jvrgc"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.564634 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.573447 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.573526 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.573579 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.574815 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-zx7v8"]
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.575792 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.680944 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7853aab4-c5b8-400c-9b11-80102982ddd3-tmp\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.681433 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56r4x\" (UniqueName: \"kubernetes.io/projected/7853aab4-c5b8-400c-9b11-80102982ddd3-kube-api-access-56r4x\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " 
pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.681632 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-entrypoint\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.681745 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/7853aab4-c5b8-400c-9b11-80102982ddd3-datadir\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.681865 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-config\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.681992 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-collector-syslog-receiver\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.682121 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-collector-token\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc 
kubenswrapper[4912]: I0318 13:18:24.682393 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-trusted-ca\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.682481 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-config-openshift-service-cacrt\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.682515 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-metrics\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.682563 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/7853aab4-c5b8-400c-9b11-80102982ddd3-sa-token\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.784975 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-metrics\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785076 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/7853aab4-c5b8-400c-9b11-80102982ddd3-sa-token\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785107 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7853aab4-c5b8-400c-9b11-80102982ddd3-tmp\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785172 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56r4x\" (UniqueName: \"kubernetes.io/projected/7853aab4-c5b8-400c-9b11-80102982ddd3-kube-api-access-56r4x\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785228 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-entrypoint\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785255 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/7853aab4-c5b8-400c-9b11-80102982ddd3-datadir\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785287 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-config\") pod \"collector-zx7v8\" (UID: 
\"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785322 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-collector-syslog-receiver\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785346 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-collector-token\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785368 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-trusted-ca\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.785587 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-config-openshift-service-cacrt\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.786081 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/7853aab4-c5b8-400c-9b11-80102982ddd3-datadir\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: 
I0318 13:18:24.788828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-config\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.790794 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-entrypoint\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.790806 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-trusted-ca\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.791857 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/7853aab4-c5b8-400c-9b11-80102982ddd3-config-openshift-service-cacrt\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.794006 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-collector-syslog-receiver\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.794162 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: 
\"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-collector-token\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.795476 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/7853aab4-c5b8-400c-9b11-80102982ddd3-metrics\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.795700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7853aab4-c5b8-400c-9b11-80102982ddd3-tmp\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.805502 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/7853aab4-c5b8-400c-9b11-80102982ddd3-sa-token\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.808958 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56r4x\" (UniqueName: \"kubernetes.io/projected/7853aab4-c5b8-400c-9b11-80102982ddd3-kube-api-access-56r4x\") pod \"collector-zx7v8\" (UID: \"7853aab4-c5b8-400c-9b11-80102982ddd3\") " pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:24 crc kubenswrapper[4912]: I0318 13:18:24.887955 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-zx7v8"
Mar 18 13:18:25 crc kubenswrapper[4912]: I0318 13:18:25.328184 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-zx7v8"]
Mar 18 13:18:25 crc kubenswrapper[4912]: W0318 13:18:25.336510 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7853aab4_c5b8_400c_9b11_80102982ddd3.slice/crio-221acacee7dcf7d7f9bd9496749e7b32946380f5bcea3a83f46ecab8593e8c2e WatchSource:0}: Error finding container 221acacee7dcf7d7f9bd9496749e7b32946380f5bcea3a83f46ecab8593e8c2e: Status 404 returned error can't find the container with id 221acacee7dcf7d7f9bd9496749e7b32946380f5bcea3a83f46ecab8593e8c2e
Mar 18 13:18:25 crc kubenswrapper[4912]: I0318 13:18:25.496823 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-zx7v8" event={"ID":"7853aab4-c5b8-400c-9b11-80102982ddd3","Type":"ContainerStarted","Data":"221acacee7dcf7d7f9bd9496749e7b32946380f5bcea3a83f46ecab8593e8c2e"}
Mar 18 13:18:26 crc kubenswrapper[4912]: I0318 13:18:26.244965 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66713b6f-a8f6-4118-b6f4-19012e591e7a" path="/var/lib/kubelet/pods/66713b6f-a8f6-4118-b6f4-19012e591e7a/volumes"
Mar 18 13:18:31 crc kubenswrapper[4912]: I0318 13:18:31.568556 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-zx7v8" event={"ID":"7853aab4-c5b8-400c-9b11-80102982ddd3","Type":"ContainerStarted","Data":"7c404e5269b583fc04cdaf7df296932ff870fa09b321f7f3adbdf1457dc60eb7"}
Mar 18 13:18:31 crc kubenswrapper[4912]: I0318 13:18:31.608120 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-zx7v8" podStartSLOduration=2.164295223 podStartE2EDuration="7.608088497s" podCreationTimestamp="2026-03-18 13:18:24 +0000 UTC" firstStartedPulling="2026-03-18 13:18:25.340266715 +0000 UTC 
m=+953.799694140" lastFinishedPulling="2026-03-18 13:18:30.784059989 +0000 UTC m=+959.243487414" observedRunningTime="2026-03-18 13:18:31.600423458 +0000 UTC m=+960.059850903" watchObservedRunningTime="2026-03-18 13:18:31.608088497 +0000 UTC m=+960.067515932"
Mar 18 13:18:33 crc kubenswrapper[4912]: I0318 13:18:33.862203 4912 scope.go:117] "RemoveContainer" containerID="d0ee852435c27a5a41a5a94aa89dc09cb28a0c542d8334c39a095800bf8c74e7"
Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.637017 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm"]
Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.639667 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm"
Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.642447 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.663788 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm"]
Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.765481 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm"
Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.765605 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh4dw\" (UniqueName: 
\"kubernetes.io/projected/3252c88f-2452-4b5a-9ebc-4410c8c9f822-kube-api-access-wh4dw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.765724 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.867337 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.867775 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh4dw\" (UniqueName: \"kubernetes.io/projected/3252c88f-2452-4b5a-9ebc-4410c8c9f822-kube-api-access-wh4dw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.867917 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-bundle\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.868088 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.868382 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.890014 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh4dw\" (UniqueName: \"kubernetes.io/projected/3252c88f-2452-4b5a-9ebc-4410c8c9f822-kube-api-access-wh4dw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:04 crc kubenswrapper[4912]: I0318 13:19:04.963938 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:05 crc kubenswrapper[4912]: I0318 13:19:05.648285 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm"] Mar 18 13:19:05 crc kubenswrapper[4912]: I0318 13:19:05.937205 4912 generic.go:334] "Generic (PLEG): container finished" podID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerID="b817c91c072fe5c2ac00bf6850aa2bffc38ab0e4165cd72e0c43ca035fadfefb" exitCode=0 Mar 18 13:19:05 crc kubenswrapper[4912]: I0318 13:19:05.937291 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" event={"ID":"3252c88f-2452-4b5a-9ebc-4410c8c9f822","Type":"ContainerDied","Data":"b817c91c072fe5c2ac00bf6850aa2bffc38ab0e4165cd72e0c43ca035fadfefb"} Mar 18 13:19:05 crc kubenswrapper[4912]: I0318 13:19:05.937372 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" event={"ID":"3252c88f-2452-4b5a-9ebc-4410c8c9f822","Type":"ContainerStarted","Data":"fab4cc999150079d82c3ad09299b64753e4d991e9d2c90b0ece1141b67cc10c5"} Mar 18 13:19:07 crc kubenswrapper[4912]: I0318 13:19:07.954893 4912 generic.go:334] "Generic (PLEG): container finished" podID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerID="ef0825e634a25bb409e2401b0edf53811758acbdd601392c7cd9341e9e33126f" exitCode=0 Mar 18 13:19:07 crc kubenswrapper[4912]: I0318 13:19:07.955024 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" event={"ID":"3252c88f-2452-4b5a-9ebc-4410c8c9f822","Type":"ContainerDied","Data":"ef0825e634a25bb409e2401b0edf53811758acbdd601392c7cd9341e9e33126f"} Mar 18 13:19:08 crc kubenswrapper[4912]: I0318 13:19:08.968398 4912 
generic.go:334] "Generic (PLEG): container finished" podID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerID="52d60e835935b3e038c12c6d6740d49a69caded8d7c289f374a01d2e0d360b97" exitCode=0 Mar 18 13:19:08 crc kubenswrapper[4912]: I0318 13:19:08.968463 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" event={"ID":"3252c88f-2452-4b5a-9ebc-4410c8c9f822","Type":"ContainerDied","Data":"52d60e835935b3e038c12c6d6740d49a69caded8d7c289f374a01d2e0d360b97"} Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.326206 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.475946 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-util\") pod \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.476005 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-bundle\") pod \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.476256 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh4dw\" (UniqueName: \"kubernetes.io/projected/3252c88f-2452-4b5a-9ebc-4410c8c9f822-kube-api-access-wh4dw\") pod \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\" (UID: \"3252c88f-2452-4b5a-9ebc-4410c8c9f822\") " Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.477028 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-bundle" (OuterVolumeSpecName: "bundle") pod "3252c88f-2452-4b5a-9ebc-4410c8c9f822" (UID: "3252c88f-2452-4b5a-9ebc-4410c8c9f822"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.483811 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3252c88f-2452-4b5a-9ebc-4410c8c9f822-kube-api-access-wh4dw" (OuterVolumeSpecName: "kube-api-access-wh4dw") pod "3252c88f-2452-4b5a-9ebc-4410c8c9f822" (UID: "3252c88f-2452-4b5a-9ebc-4410c8c9f822"). InnerVolumeSpecName "kube-api-access-wh4dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.489287 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-util" (OuterVolumeSpecName: "util") pod "3252c88f-2452-4b5a-9ebc-4410c8c9f822" (UID: "3252c88f-2452-4b5a-9ebc-4410c8c9f822"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.578102 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh4dw\" (UniqueName: \"kubernetes.io/projected/3252c88f-2452-4b5a-9ebc-4410c8c9f822-kube-api-access-wh4dw\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.578157 4912 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-util\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.578169 4912 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3252c88f-2452-4b5a-9ebc-4410c8c9f822-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.985080 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" event={"ID":"3252c88f-2452-4b5a-9ebc-4410c8c9f822","Type":"ContainerDied","Data":"fab4cc999150079d82c3ad09299b64753e4d991e9d2c90b0ece1141b67cc10c5"} Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.985126 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab4cc999150079d82c3ad09299b64753e4d991e9d2c90b0ece1141b67cc10c5" Mar 18 13:19:10 crc kubenswrapper[4912]: I0318 13:19:10.985205 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.993419 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9"] Mar 18 13:19:13 crc kubenswrapper[4912]: E0318 13:19:13.994306 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerName="util" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.994325 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerName="util" Mar 18 13:19:13 crc kubenswrapper[4912]: E0318 13:19:13.994343 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerName="pull" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.994351 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerName="pull" Mar 18 13:19:13 crc kubenswrapper[4912]: E0318 13:19:13.994364 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerName="extract" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.994375 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerName="extract" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.994542 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3252c88f-2452-4b5a-9ebc-4410c8c9f822" containerName="extract" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.995310 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.997899 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 13:19:13 crc kubenswrapper[4912]: I0318 13:19:13.997909 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9px9r" Mar 18 13:19:14 crc kubenswrapper[4912]: I0318 13:19:14.007359 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 13:19:14 crc kubenswrapper[4912]: I0318 13:19:14.010915 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9"] Mar 18 13:19:14 crc kubenswrapper[4912]: I0318 13:19:14.161528 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qzdt\" (UniqueName: \"kubernetes.io/projected/5343c7cb-068f-489d-be0f-a09ea457e71f-kube-api-access-8qzdt\") pod \"nmstate-operator-796d4cfff4-2ppb9\" (UID: \"5343c7cb-068f-489d-be0f-a09ea457e71f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" Mar 18 13:19:14 crc kubenswrapper[4912]: I0318 13:19:14.263986 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qzdt\" (UniqueName: \"kubernetes.io/projected/5343c7cb-068f-489d-be0f-a09ea457e71f-kube-api-access-8qzdt\") pod \"nmstate-operator-796d4cfff4-2ppb9\" (UID: \"5343c7cb-068f-489d-be0f-a09ea457e71f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" Mar 18 13:19:14 crc kubenswrapper[4912]: I0318 13:19:14.287130 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qzdt\" (UniqueName: \"kubernetes.io/projected/5343c7cb-068f-489d-be0f-a09ea457e71f-kube-api-access-8qzdt\") pod \"nmstate-operator-796d4cfff4-2ppb9\" (UID: 
\"5343c7cb-068f-489d-be0f-a09ea457e71f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" Mar 18 13:19:14 crc kubenswrapper[4912]: I0318 13:19:14.321935 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" Mar 18 13:19:14 crc kubenswrapper[4912]: I0318 13:19:14.756295 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9"] Mar 18 13:19:15 crc kubenswrapper[4912]: I0318 13:19:15.017986 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" event={"ID":"5343c7cb-068f-489d-be0f-a09ea457e71f","Type":"ContainerStarted","Data":"90783ef7c9af1ef3acecc9d7141b5b64464e71aa6ae5c3ecaf8ae96f82a28178"} Mar 18 13:19:18 crc kubenswrapper[4912]: I0318 13:19:18.049965 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" event={"ID":"5343c7cb-068f-489d-be0f-a09ea457e71f","Type":"ContainerStarted","Data":"5d19cfa72c2337cf3a528bbae994ab9f1ca7efa3086389db9c987a5ef26f3343"} Mar 18 13:19:18 crc kubenswrapper[4912]: I0318 13:19:18.072876 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2ppb9" podStartSLOduration=2.764556147 podStartE2EDuration="5.072855876s" podCreationTimestamp="2026-03-18 13:19:13 +0000 UTC" firstStartedPulling="2026-03-18 13:19:14.769263937 +0000 UTC m=+1003.228691362" lastFinishedPulling="2026-03-18 13:19:17.077563666 +0000 UTC m=+1005.536991091" observedRunningTime="2026-03-18 13:19:18.068540759 +0000 UTC m=+1006.527968184" watchObservedRunningTime="2026-03-18 13:19:18.072855876 +0000 UTC m=+1006.532283301" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.121225 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-775r6"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 
13:19:19.123255 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.133872 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-j48bv" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.145985 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-775r6"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.180053 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.181706 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.187921 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.222122 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zszmc"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.223708 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.239251 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.265383 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v9q4\" (UniqueName: \"kubernetes.io/projected/e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf-kube-api-access-5v9q4\") pod \"nmstate-metrics-9b8c8685d-775r6\" (UID: \"e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.265499 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r7bm\" (UniqueName: \"kubernetes.io/projected/3e51cc8b-d69c-4be9-8b12-c1a10c653621-kube-api-access-8r7bm\") pod \"nmstate-webhook-5f558f5558-jkd5w\" (UID: \"3e51cc8b-d69c-4be9-8b12-c1a10c653621\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.265578 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3e51cc8b-d69c-4be9-8b12-c1a10c653621-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-jkd5w\" (UID: \"3e51cc8b-d69c-4be9-8b12-c1a10c653621\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.367827 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v9q4\" (UniqueName: \"kubernetes.io/projected/e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf-kube-api-access-5v9q4\") pod \"nmstate-metrics-9b8c8685d-775r6\" (UID: \"e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" Mar 18 13:19:19 crc kubenswrapper[4912]: 
I0318 13:19:19.367909 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk7cg\" (UniqueName: \"kubernetes.io/projected/23eaceef-5a11-4610-91b0-6ca3c42c167f-kube-api-access-rk7cg\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.367951 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-dbus-socket\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.368308 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r7bm\" (UniqueName: \"kubernetes.io/projected/3e51cc8b-d69c-4be9-8b12-c1a10c653621-kube-api-access-8r7bm\") pod \"nmstate-webhook-5f558f5558-jkd5w\" (UID: \"3e51cc8b-d69c-4be9-8b12-c1a10c653621\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.368466 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-nmstate-lock\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.368496 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-ovs-socket\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 
crc kubenswrapper[4912]: I0318 13:19:19.368717 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3e51cc8b-d69c-4be9-8b12-c1a10c653621-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-jkd5w\" (UID: \"3e51cc8b-d69c-4be9-8b12-c1a10c653621\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: E0318 13:19:19.369140 4912 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 13:19:19 crc kubenswrapper[4912]: E0318 13:19:19.369201 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e51cc8b-d69c-4be9-8b12-c1a10c653621-tls-key-pair podName:3e51cc8b-d69c-4be9-8b12-c1a10c653621 nodeName:}" failed. No retries permitted until 2026-03-18 13:19:19.869181911 +0000 UTC m=+1008.328609336 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/3e51cc8b-d69c-4be9-8b12-c1a10c653621-tls-key-pair") pod "nmstate-webhook-5f558f5558-jkd5w" (UID: "3e51cc8b-d69c-4be9-8b12-c1a10c653621") : secret "openshift-nmstate-webhook" not found Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.406287 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.409177 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.418309 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.418566 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-v4nr5" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.418695 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.420123 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r7bm\" (UniqueName: \"kubernetes.io/projected/3e51cc8b-d69c-4be9-8b12-c1a10c653621-kube-api-access-8r7bm\") pod \"nmstate-webhook-5f558f5558-jkd5w\" (UID: \"3e51cc8b-d69c-4be9-8b12-c1a10c653621\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.424962 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v9q4\" (UniqueName: \"kubernetes.io/projected/e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf-kube-api-access-5v9q4\") pod \"nmstate-metrics-9b8c8685d-775r6\" (UID: \"e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.437738 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.458488 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.471318 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk7cg\" (UniqueName: \"kubernetes.io/projected/23eaceef-5a11-4610-91b0-6ca3c42c167f-kube-api-access-rk7cg\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.471382 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-dbus-socket\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.471461 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-nmstate-lock\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.471479 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-ovs-socket\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.475634 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-nmstate-lock\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc 
kubenswrapper[4912]: I0318 13:19:19.475696 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-ovs-socket\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.527902 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/23eaceef-5a11-4610-91b0-6ca3c42c167f-dbus-socket\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.556674 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk7cg\" (UniqueName: \"kubernetes.io/projected/23eaceef-5a11-4610-91b0-6ca3c42c167f-kube-api-access-rk7cg\") pod \"nmstate-handler-zszmc\" (UID: \"23eaceef-5a11-4610-91b0-6ca3c42c167f\") " pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.599888 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a7f16942-c64f-46a0-84ad-a52844af0d08-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.600018 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hsg\" (UniqueName: \"kubernetes.io/projected/a7f16942-c64f-46a0-84ad-a52844af0d08-kube-api-access-f9hsg\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 
crc kubenswrapper[4912]: I0318 13:19:19.600094 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7f16942-c64f-46a0-84ad-a52844af0d08-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.663826 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-fccc4d7b-dngkq"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.667642 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.691779 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fccc4d7b-dngkq"] Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.701928 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hsg\" (UniqueName: \"kubernetes.io/projected/a7f16942-c64f-46a0-84ad-a52844af0d08-kube-api-access-f9hsg\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.702001 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7f16942-c64f-46a0-84ad-a52844af0d08-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.702093 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/a7f16942-c64f-46a0-84ad-a52844af0d08-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.703636 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a7f16942-c64f-46a0-84ad-a52844af0d08-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.709859 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7f16942-c64f-46a0-84ad-a52844af0d08-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.748990 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hsg\" (UniqueName: \"kubernetes.io/projected/a7f16942-c64f-46a0-84ad-a52844af0d08-kube-api-access-f9hsg\") pod \"nmstate-console-plugin-86f58fcf4-v7l52\" (UID: \"a7f16942-c64f-46a0-84ad-a52844af0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.803447 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-serving-cert\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.803505 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-trusted-ca-bundle\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.803560 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-oauth-serving-cert\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.803635 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqfqq\" (UniqueName: \"kubernetes.io/projected/ea48ad05-2840-485b-9aef-8477c33cf61b-kube-api-access-qqfqq\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.803657 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-service-ca\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.803728 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-console-config\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 
13:19:19.803759 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-oauth-config\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.843549 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:19 crc kubenswrapper[4912]: W0318 13:19:19.894456 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23eaceef_5a11_4610_91b0_6ca3c42c167f.slice/crio-d4285a81856802fda4806f5076e51ba63a8eeb0f7ded871950b2c5d5be8ab178 WatchSource:0}: Error finding container d4285a81856802fda4806f5076e51ba63a8eeb0f7ded871950b2c5d5be8ab178: Status 404 returned error can't find the container with id d4285a81856802fda4806f5076e51ba63a8eeb0f7ded871950b2c5d5be8ab178 Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.902434 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905311 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqfqq\" (UniqueName: \"kubernetes.io/projected/ea48ad05-2840-485b-9aef-8477c33cf61b-kube-api-access-qqfqq\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905363 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-service-ca\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905471 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-console-config\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905509 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-oauth-config\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905571 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-serving-cert\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " 
pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905604 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-trusted-ca-bundle\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905635 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3e51cc8b-d69c-4be9-8b12-c1a10c653621-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-jkd5w\" (UID: \"3e51cc8b-d69c-4be9-8b12-c1a10c653621\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.905672 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-oauth-serving-cert\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.907504 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-oauth-serving-cert\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.909631 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-service-ca\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 
crc kubenswrapper[4912]: I0318 13:19:19.910327 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-trusted-ca-bundle\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.911462 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-oauth-config\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.911485 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-console-config\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.911819 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3e51cc8b-d69c-4be9-8b12-c1a10c653621-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-jkd5w\" (UID: \"3e51cc8b-d69c-4be9-8b12-c1a10c653621\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.914547 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-serving-cert\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.924257 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqfqq\" (UniqueName: \"kubernetes.io/projected/ea48ad05-2840-485b-9aef-8477c33cf61b-kube-api-access-qqfqq\") pod \"console-fccc4d7b-dngkq\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:19 crc kubenswrapper[4912]: I0318 13:19:19.998261 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:20 crc kubenswrapper[4912]: I0318 13:19:20.080667 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zszmc" event={"ID":"23eaceef-5a11-4610-91b0-6ca3c42c167f","Type":"ContainerStarted","Data":"d4285a81856802fda4806f5076e51ba63a8eeb0f7ded871950b2c5d5be8ab178"} Mar 18 13:19:20 crc kubenswrapper[4912]: I0318 13:19:20.098339 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-775r6"] Mar 18 13:19:20 crc kubenswrapper[4912]: I0318 13:19:20.102851 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:20 crc kubenswrapper[4912]: W0318 13:19:20.106382 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3aa2cf0_8cc6_4c4f_b192_c3bf7113f7bf.slice/crio-ee9ca4041da0a1a8399d84b1c2f881d828fbd906568281ad1b7314be47a83ba3 WatchSource:0}: Error finding container ee9ca4041da0a1a8399d84b1c2f881d828fbd906568281ad1b7314be47a83ba3: Status 404 returned error can't find the container with id ee9ca4041da0a1a8399d84b1c2f881d828fbd906568281ad1b7314be47a83ba3 Mar 18 13:19:20 crc kubenswrapper[4912]: I0318 13:19:20.326623 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fccc4d7b-dngkq"] Mar 18 13:19:20 crc kubenswrapper[4912]: I0318 13:19:20.382520 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52"] Mar 18 13:19:20 crc kubenswrapper[4912]: W0318 13:19:20.415703 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f16942_c64f_46a0_84ad_a52844af0d08.slice/crio-c2002ffad579995fae032f853526a0b77133139efc1af1b0e9eeb9bc1525d332 WatchSource:0}: Error finding container c2002ffad579995fae032f853526a0b77133139efc1af1b0e9eeb9bc1525d332: Status 404 returned error can't find the container with id c2002ffad579995fae032f853526a0b77133139efc1af1b0e9eeb9bc1525d332 Mar 18 13:19:20 crc kubenswrapper[4912]: I0318 13:19:20.440267 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w"] Mar 18 13:19:20 crc kubenswrapper[4912]: W0318 13:19:20.455556 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e51cc8b_d69c_4be9_8b12_c1a10c653621.slice/crio-d056e1392057173a9c3ed55fb28f65224e7119ccb77aafc4ae714c7a39ede189 
WatchSource:0}: Error finding container d056e1392057173a9c3ed55fb28f65224e7119ccb77aafc4ae714c7a39ede189: Status 404 returned error can't find the container with id d056e1392057173a9c3ed55fb28f65224e7119ccb77aafc4ae714c7a39ede189 Mar 18 13:19:21 crc kubenswrapper[4912]: I0318 13:19:21.091815 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" event={"ID":"3e51cc8b-d69c-4be9-8b12-c1a10c653621","Type":"ContainerStarted","Data":"d056e1392057173a9c3ed55fb28f65224e7119ccb77aafc4ae714c7a39ede189"} Mar 18 13:19:21 crc kubenswrapper[4912]: I0318 13:19:21.094467 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" event={"ID":"a7f16942-c64f-46a0-84ad-a52844af0d08","Type":"ContainerStarted","Data":"c2002ffad579995fae032f853526a0b77133139efc1af1b0e9eeb9bc1525d332"} Mar 18 13:19:21 crc kubenswrapper[4912]: I0318 13:19:21.095893 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" event={"ID":"e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf","Type":"ContainerStarted","Data":"ee9ca4041da0a1a8399d84b1c2f881d828fbd906568281ad1b7314be47a83ba3"} Mar 18 13:19:21 crc kubenswrapper[4912]: I0318 13:19:21.098480 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fccc4d7b-dngkq" event={"ID":"ea48ad05-2840-485b-9aef-8477c33cf61b","Type":"ContainerStarted","Data":"43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188"} Mar 18 13:19:21 crc kubenswrapper[4912]: I0318 13:19:21.098524 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fccc4d7b-dngkq" event={"ID":"ea48ad05-2840-485b-9aef-8477c33cf61b","Type":"ContainerStarted","Data":"586fda836b9031546f2e3eef89f09de12836e1b4ccc92bc2417aaa1edf2f05f8"} Mar 18 13:19:21 crc kubenswrapper[4912]: I0318 13:19:21.125147 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-fccc4d7b-dngkq" podStartSLOduration=2.125124852 podStartE2EDuration="2.125124852s" podCreationTimestamp="2026-03-18 13:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:19:21.121469683 +0000 UTC m=+1009.580897118" watchObservedRunningTime="2026-03-18 13:19:21.125124852 +0000 UTC m=+1009.584552277" Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.134416 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" event={"ID":"e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf","Type":"ContainerStarted","Data":"5c8840e570f85eed9af8ecbf33c0c67bac47c4d80b7e3e731cc22d22c173300e"} Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.136764 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" event={"ID":"3e51cc8b-d69c-4be9-8b12-c1a10c653621","Type":"ContainerStarted","Data":"7f8a5a4a420977a35d147ab00e68ce07aa87b07cf3ade41354d1dc9e64f99f19"} Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.136928 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.142626 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" event={"ID":"a7f16942-c64f-46a0-84ad-a52844af0d08","Type":"ContainerStarted","Data":"02672e1f9ac22270f58eb38e102ecb79bce1c2b4bd31343f47c4b36f5318ca13"} Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.144958 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zszmc" event={"ID":"23eaceef-5a11-4610-91b0-6ca3c42c167f","Type":"ContainerStarted","Data":"7b1bd526d6b1a55a87cc9c3f24f1ca241dad35e91a4d91bb330cecb0b5fc35ba"} Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.145106 4912 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.161755 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" podStartSLOduration=2.10991699 podStartE2EDuration="5.161734074s" podCreationTimestamp="2026-03-18 13:19:19 +0000 UTC" firstStartedPulling="2026-03-18 13:19:20.461084639 +0000 UTC m=+1008.920512064" lastFinishedPulling="2026-03-18 13:19:23.512901723 +0000 UTC m=+1011.972329148" observedRunningTime="2026-03-18 13:19:24.158166097 +0000 UTC m=+1012.617593532" watchObservedRunningTime="2026-03-18 13:19:24.161734074 +0000 UTC m=+1012.621161489" Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.193440 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zszmc" podStartSLOduration=1.578004468 podStartE2EDuration="5.193401755s" podCreationTimestamp="2026-03-18 13:19:19 +0000 UTC" firstStartedPulling="2026-03-18 13:19:19.89655538 +0000 UTC m=+1008.355982805" lastFinishedPulling="2026-03-18 13:19:23.511952667 +0000 UTC m=+1011.971380092" observedRunningTime="2026-03-18 13:19:24.187273698 +0000 UTC m=+1012.646701123" watchObservedRunningTime="2026-03-18 13:19:24.193401755 +0000 UTC m=+1012.652829180" Mar 18 13:19:24 crc kubenswrapper[4912]: I0318 13:19:24.229599 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-v7l52" podStartSLOduration=2.135248368 podStartE2EDuration="5.229571868s" podCreationTimestamp="2026-03-18 13:19:19 +0000 UTC" firstStartedPulling="2026-03-18 13:19:20.415826358 +0000 UTC m=+1008.875253783" lastFinishedPulling="2026-03-18 13:19:23.510149858 +0000 UTC m=+1011.969577283" observedRunningTime="2026-03-18 13:19:24.217441709 +0000 UTC m=+1012.676869144" watchObservedRunningTime="2026-03-18 13:19:24.229571868 +0000 UTC 
m=+1012.688999293" Mar 18 13:19:27 crc kubenswrapper[4912]: I0318 13:19:27.179527 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" event={"ID":"e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf","Type":"ContainerStarted","Data":"3428cf430ef5736d94231ab7f01305b00a824fbf3a687181e3f5030003500c40"} Mar 18 13:19:27 crc kubenswrapper[4912]: I0318 13:19:27.224995 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-775r6" podStartSLOduration=1.827966924 podStartE2EDuration="8.224964418s" podCreationTimestamp="2026-03-18 13:19:19 +0000 UTC" firstStartedPulling="2026-03-18 13:19:20.117292342 +0000 UTC m=+1008.576719777" lastFinishedPulling="2026-03-18 13:19:26.514289846 +0000 UTC m=+1014.973717271" observedRunningTime="2026-03-18 13:19:27.20809483 +0000 UTC m=+1015.667522305" watchObservedRunningTime="2026-03-18 13:19:27.224964418 +0000 UTC m=+1015.684391853" Mar 18 13:19:29 crc kubenswrapper[4912]: I0318 13:19:29.877704 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 13:19:30 crc kubenswrapper[4912]: I0318 13:19:30.000135 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:30 crc kubenswrapper[4912]: I0318 13:19:30.000208 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:30 crc kubenswrapper[4912]: I0318 13:19:30.009566 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:30 crc kubenswrapper[4912]: I0318 13:19:30.781817 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:19:30 crc kubenswrapper[4912]: I0318 13:19:30.855250 4912 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-647cc7864c-595s8"] Mar 18 13:19:40 crc kubenswrapper[4912]: I0318 13:19:40.110008 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 13:19:55 crc kubenswrapper[4912]: I0318 13:19:55.909131 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-647cc7864c-595s8" podUID="5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" containerName="console" containerID="cri-o://351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451" gracePeriod=15 Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.456478 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-647cc7864c-595s8_5b9f3f7e-c6a1-4de4-9057-5afd584c45f6/console/0.log" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.456898 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.555074 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-service-ca\") pod \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.555215 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfxr8\" (UniqueName: \"kubernetes.io/projected/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-kube-api-access-qfxr8\") pod \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.555422 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-serving-cert\") pod \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.555455 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-config\") pod \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.555492 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-trusted-ca-bundle\") pod \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.555561 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-oauth-config\") pod \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.555618 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-oauth-serving-cert\") pod \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\" (UID: \"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6\") " Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.556147 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-service-ca" (OuterVolumeSpecName: "service-ca") pod "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" (UID: "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.556189 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-config" (OuterVolumeSpecName: "console-config") pod "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" (UID: "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.556286 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" (UID: "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.558840 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" (UID: "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.565349 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" (UID: "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.582893 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" (UID: "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.587308 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-kube-api-access-qfxr8" (OuterVolumeSpecName: "kube-api-access-qfxr8") pod "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" (UID: "5b9f3f7e-c6a1-4de4-9057-5afd584c45f6"). InnerVolumeSpecName "kube-api-access-qfxr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.658463 4912 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.658500 4912 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.658512 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.658522 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfxr8\" (UniqueName: 
\"kubernetes.io/projected/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-kube-api-access-qfxr8\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.658534 4912 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.658542 4912 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:56 crc kubenswrapper[4912]: I0318 13:19:56.658552 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.002293 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-647cc7864c-595s8_5b9f3f7e-c6a1-4de4-9057-5afd584c45f6/console/0.log" Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.002371 4912 generic.go:334] "Generic (PLEG): container finished" podID="5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" containerID="351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451" exitCode=2 Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.002415 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-647cc7864c-595s8" event={"ID":"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6","Type":"ContainerDied","Data":"351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451"} Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.002440 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-647cc7864c-595s8" Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.002457 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-647cc7864c-595s8" event={"ID":"5b9f3f7e-c6a1-4de4-9057-5afd584c45f6","Type":"ContainerDied","Data":"867e4f1aa443970b3cf77bf62f5f119404bf1af5fcab9a79e0adfb15e74e6348"} Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.002483 4912 scope.go:117] "RemoveContainer" containerID="351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451" Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.030223 4912 scope.go:117] "RemoveContainer" containerID="351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451" Mar 18 13:19:57 crc kubenswrapper[4912]: E0318 13:19:57.030874 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451\": container with ID starting with 351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451 not found: ID does not exist" containerID="351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451" Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.030906 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451"} err="failed to get container status \"351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451\": rpc error: code = NotFound desc = could not find container \"351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451\": container with ID starting with 351dd0f273c1fd95f2ded6432cc59cfaa384a81fb14ac26a35138dca95469451 not found: ID does not exist" Mar 18 13:19:57 crc kubenswrapper[4912]: I0318 13:19:57.040872 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-647cc7864c-595s8"] Mar 18 13:19:57 crc 
kubenswrapper[4912]: I0318 13:19:57.051059 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-647cc7864c-595s8"] Mar 18 13:19:58 crc kubenswrapper[4912]: I0318 13:19:58.238601 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" path="/var/lib/kubelet/pods/5b9f3f7e-c6a1-4de4-9057-5afd584c45f6/volumes" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.020845 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c"] Mar 18 13:19:59 crc kubenswrapper[4912]: E0318 13:19:59.021658 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" containerName="console" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.021802 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" containerName="console" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.022100 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b9f3f7e-c6a1-4de4-9057-5afd584c45f6" containerName="console" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.023395 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.026087 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.031724 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c"] Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.132031 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.132562 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.132611 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc2x6\" (UniqueName: \"kubernetes.io/projected/9d7017e2-48e0-4868-b763-4186e771faae-kube-api-access-lc2x6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: 
I0318 13:19:59.233818 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.233986 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.234087 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc2x6\" (UniqueName: \"kubernetes.io/projected/9d7017e2-48e0-4868-b763-4186e771faae-kube-api-access-lc2x6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.234876 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.235007 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.257079 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc2x6\" (UniqueName: \"kubernetes.io/projected/9d7017e2-48e0-4868-b763-4186e771faae-kube-api-access-lc2x6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.342181 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:19:59 crc kubenswrapper[4912]: I0318 13:19:59.850357 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c"] Mar 18 13:19:59 crc kubenswrapper[4912]: W0318 13:19:59.858493 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d7017e2_48e0_4868_b763_4186e771faae.slice/crio-cf94f6a340285330a49be432dbd0e8823723046b9449ef45af52a4fa4c05126a WatchSource:0}: Error finding container cf94f6a340285330a49be432dbd0e8823723046b9449ef45af52a4fa4c05126a: Status 404 returned error can't find the container with id cf94f6a340285330a49be432dbd0e8823723046b9449ef45af52a4fa4c05126a Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.032747 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" 
event={"ID":"9d7017e2-48e0-4868-b763-4186e771faae","Type":"ContainerStarted","Data":"daab27f321e7146ed72dd52b7ad958336b750005527ff41e77072dafb6605ff9"} Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.032836 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" event={"ID":"9d7017e2-48e0-4868-b763-4186e771faae","Type":"ContainerStarted","Data":"cf94f6a340285330a49be432dbd0e8823723046b9449ef45af52a4fa4c05126a"} Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.144692 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564000-sn5lp"] Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.145802 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.148793 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.149114 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.149355 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.170490 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-sn5lp"] Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.253055 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jmr9\" (UniqueName: \"kubernetes.io/projected/e59fbd6e-ada8-4bbc-bfcc-e80a464664f9-kube-api-access-5jmr9\") pod \"auto-csr-approver-29564000-sn5lp\" (UID: \"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9\") " 
pod="openshift-infra/auto-csr-approver-29564000-sn5lp" Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.355309 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jmr9\" (UniqueName: \"kubernetes.io/projected/e59fbd6e-ada8-4bbc-bfcc-e80a464664f9-kube-api-access-5jmr9\") pod \"auto-csr-approver-29564000-sn5lp\" (UID: \"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9\") " pod="openshift-infra/auto-csr-approver-29564000-sn5lp" Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.377373 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jmr9\" (UniqueName: \"kubernetes.io/projected/e59fbd6e-ada8-4bbc-bfcc-e80a464664f9-kube-api-access-5jmr9\") pod \"auto-csr-approver-29564000-sn5lp\" (UID: \"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9\") " pod="openshift-infra/auto-csr-approver-29564000-sn5lp" Mar 18 13:20:00 crc kubenswrapper[4912]: I0318 13:20:00.463972 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" Mar 18 13:20:01 crc kubenswrapper[4912]: I0318 13:20:01.045616 4912 generic.go:334] "Generic (PLEG): container finished" podID="9d7017e2-48e0-4868-b763-4186e771faae" containerID="daab27f321e7146ed72dd52b7ad958336b750005527ff41e77072dafb6605ff9" exitCode=0 Mar 18 13:20:01 crc kubenswrapper[4912]: I0318 13:20:01.045669 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" event={"ID":"9d7017e2-48e0-4868-b763-4186e771faae","Type":"ContainerDied","Data":"daab27f321e7146ed72dd52b7ad958336b750005527ff41e77072dafb6605ff9"} Mar 18 13:20:01 crc kubenswrapper[4912]: I0318 13:20:01.055324 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-sn5lp"] Mar 18 13:20:01 crc kubenswrapper[4912]: W0318 13:20:01.061575 4912 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode59fbd6e_ada8_4bbc_bfcc_e80a464664f9.slice/crio-68118ef22baf5a81852abb0d5fb5e62d11741d03f88104fde5c3f1d2eae1c25d WatchSource:0}: Error finding container 68118ef22baf5a81852abb0d5fb5e62d11741d03f88104fde5c3f1d2eae1c25d: Status 404 returned error can't find the container with id 68118ef22baf5a81852abb0d5fb5e62d11741d03f88104fde5c3f1d2eae1c25d Mar 18 13:20:02 crc kubenswrapper[4912]: I0318 13:20:02.055363 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" event={"ID":"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9","Type":"ContainerStarted","Data":"68118ef22baf5a81852abb0d5fb5e62d11741d03f88104fde5c3f1d2eae1c25d"} Mar 18 13:20:03 crc kubenswrapper[4912]: I0318 13:20:03.065461 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" event={"ID":"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9","Type":"ContainerStarted","Data":"2f750434743d18b9fb3630441917dd76a707a124f6254124e0111e1f9bb0f3e6"} Mar 18 13:20:03 crc kubenswrapper[4912]: I0318 13:20:03.093278 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" podStartSLOduration=1.798927602 podStartE2EDuration="3.093253543s" podCreationTimestamp="2026-03-18 13:20:00 +0000 UTC" firstStartedPulling="2026-03-18 13:20:01.070767014 +0000 UTC m=+1049.530194439" lastFinishedPulling="2026-03-18 13:20:02.365092955 +0000 UTC m=+1050.824520380" observedRunningTime="2026-03-18 13:20:03.085472551 +0000 UTC m=+1051.544899986" watchObservedRunningTime="2026-03-18 13:20:03.093253543 +0000 UTC m=+1051.552680968" Mar 18 13:20:04 crc kubenswrapper[4912]: I0318 13:20:04.074534 4912 generic.go:334] "Generic (PLEG): container finished" podID="9d7017e2-48e0-4868-b763-4186e771faae" containerID="5c8e99c8d768f424a88c207aba10c474a8f1e2b8920cfdc3affba17aba159a07" exitCode=0 Mar 18 13:20:04 crc kubenswrapper[4912]: 
I0318 13:20:04.074624 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" event={"ID":"9d7017e2-48e0-4868-b763-4186e771faae","Type":"ContainerDied","Data":"5c8e99c8d768f424a88c207aba10c474a8f1e2b8920cfdc3affba17aba159a07"} Mar 18 13:20:04 crc kubenswrapper[4912]: I0318 13:20:04.077329 4912 generic.go:334] "Generic (PLEG): container finished" podID="e59fbd6e-ada8-4bbc-bfcc-e80a464664f9" containerID="2f750434743d18b9fb3630441917dd76a707a124f6254124e0111e1f9bb0f3e6" exitCode=0 Mar 18 13:20:04 crc kubenswrapper[4912]: I0318 13:20:04.077360 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" event={"ID":"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9","Type":"ContainerDied","Data":"2f750434743d18b9fb3630441917dd76a707a124f6254124e0111e1f9bb0f3e6"} Mar 18 13:20:05 crc kubenswrapper[4912]: I0318 13:20:05.089587 4912 generic.go:334] "Generic (PLEG): container finished" podID="9d7017e2-48e0-4868-b763-4186e771faae" containerID="30ca32564f50e096db1b0dea0ce9ca1946807cf438ce64839bf9fd373d9a9eee" exitCode=0 Mar 18 13:20:05 crc kubenswrapper[4912]: I0318 13:20:05.089668 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" event={"ID":"9d7017e2-48e0-4868-b763-4186e771faae","Type":"ContainerDied","Data":"30ca32564f50e096db1b0dea0ce9ca1946807cf438ce64839bf9fd373d9a9eee"} Mar 18 13:20:05 crc kubenswrapper[4912]: I0318 13:20:05.561483 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" Mar 18 13:20:05 crc kubenswrapper[4912]: I0318 13:20:05.669378 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jmr9\" (UniqueName: \"kubernetes.io/projected/e59fbd6e-ada8-4bbc-bfcc-e80a464664f9-kube-api-access-5jmr9\") pod \"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9\" (UID: \"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9\") " Mar 18 13:20:05 crc kubenswrapper[4912]: I0318 13:20:05.676748 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59fbd6e-ada8-4bbc-bfcc-e80a464664f9-kube-api-access-5jmr9" (OuterVolumeSpecName: "kube-api-access-5jmr9") pod "e59fbd6e-ada8-4bbc-bfcc-e80a464664f9" (UID: "e59fbd6e-ada8-4bbc-bfcc-e80a464664f9"). InnerVolumeSpecName "kube-api-access-5jmr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:20:05 crc kubenswrapper[4912]: I0318 13:20:05.771553 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jmr9\" (UniqueName: \"kubernetes.io/projected/e59fbd6e-ada8-4bbc-bfcc-e80a464664f9-kube-api-access-5jmr9\") on node \"crc\" DevicePath \"\"" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.100563 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.102291 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-sn5lp" event={"ID":"e59fbd6e-ada8-4bbc-bfcc-e80a464664f9","Type":"ContainerDied","Data":"68118ef22baf5a81852abb0d5fb5e62d11741d03f88104fde5c3f1d2eae1c25d"} Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.102630 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68118ef22baf5a81852abb0d5fb5e62d11741d03f88104fde5c3f1d2eae1c25d" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.434787 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.587989 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc2x6\" (UniqueName: \"kubernetes.io/projected/9d7017e2-48e0-4868-b763-4186e771faae-kube-api-access-lc2x6\") pod \"9d7017e2-48e0-4868-b763-4186e771faae\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.589611 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-bundle\") pod \"9d7017e2-48e0-4868-b763-4186e771faae\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.591393 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-util\") pod \"9d7017e2-48e0-4868-b763-4186e771faae\" (UID: \"9d7017e2-48e0-4868-b763-4186e771faae\") " Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.591418 4912 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-bundle" (OuterVolumeSpecName: "bundle") pod "9d7017e2-48e0-4868-b763-4186e771faae" (UID: "9d7017e2-48e0-4868-b763-4186e771faae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.593252 4912 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.593480 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7017e2-48e0-4868-b763-4186e771faae-kube-api-access-lc2x6" (OuterVolumeSpecName: "kube-api-access-lc2x6") pod "9d7017e2-48e0-4868-b763-4186e771faae" (UID: "9d7017e2-48e0-4868-b763-4186e771faae"). InnerVolumeSpecName "kube-api-access-lc2x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.605806 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-util" (OuterVolumeSpecName: "util") pod "9d7017e2-48e0-4868-b763-4186e771faae" (UID: "9d7017e2-48e0-4868-b763-4186e771faae"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.631130 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-8tp7p"] Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.640896 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-8tp7p"] Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.695153 4912 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7017e2-48e0-4868-b763-4186e771faae-util\") on node \"crc\" DevicePath \"\"" Mar 18 13:20:06 crc kubenswrapper[4912]: I0318 13:20:06.695195 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc2x6\" (UniqueName: \"kubernetes.io/projected/9d7017e2-48e0-4868-b763-4186e771faae-kube-api-access-lc2x6\") on node \"crc\" DevicePath \"\"" Mar 18 13:20:07 crc kubenswrapper[4912]: I0318 13:20:07.111168 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" event={"ID":"9d7017e2-48e0-4868-b763-4186e771faae","Type":"ContainerDied","Data":"cf94f6a340285330a49be432dbd0e8823723046b9449ef45af52a4fa4c05126a"} Mar 18 13:20:07 crc kubenswrapper[4912]: I0318 13:20:07.111226 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf94f6a340285330a49be432dbd0e8823723046b9449ef45af52a4fa4c05126a" Mar 18 13:20:07 crc kubenswrapper[4912]: I0318 13:20:07.111304 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c" Mar 18 13:20:08 crc kubenswrapper[4912]: I0318 13:20:08.241619 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17cc9abc-8326-4767-ba0f-6efee8163b1f" path="/var/lib/kubelet/pods/17cc9abc-8326-4767-ba0f-6efee8163b1f/volumes" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.722013 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-666765756d-v7mtx"] Mar 18 13:20:17 crc kubenswrapper[4912]: E0318 13:20:17.723318 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59fbd6e-ada8-4bbc-bfcc-e80a464664f9" containerName="oc" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.723334 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59fbd6e-ada8-4bbc-bfcc-e80a464664f9" containerName="oc" Mar 18 13:20:17 crc kubenswrapper[4912]: E0318 13:20:17.723347 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7017e2-48e0-4868-b763-4186e771faae" containerName="util" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.723353 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7017e2-48e0-4868-b763-4186e771faae" containerName="util" Mar 18 13:20:17 crc kubenswrapper[4912]: E0318 13:20:17.723376 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7017e2-48e0-4868-b763-4186e771faae" containerName="extract" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.723384 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7017e2-48e0-4868-b763-4186e771faae" containerName="extract" Mar 18 13:20:17 crc kubenswrapper[4912]: E0318 13:20:17.723402 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7017e2-48e0-4868-b763-4186e771faae" containerName="pull" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.723408 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9d7017e2-48e0-4868-b763-4186e771faae" containerName="pull" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.723553 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7017e2-48e0-4868-b763-4186e771faae" containerName="extract" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.723568 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59fbd6e-ada8-4bbc-bfcc-e80a464664f9" containerName="oc" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.724208 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.730002 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.730050 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.731374 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.731592 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tqwsr" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.741469 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.763299 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-666765756d-v7mtx"] Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.804840 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-apiservice-cert\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.804956 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-webhook-cert\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.805051 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdr7z\" (UniqueName: \"kubernetes.io/projected/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-kube-api-access-qdr7z\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.907178 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-webhook-cert\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.907267 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdr7z\" (UniqueName: \"kubernetes.io/projected/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-kube-api-access-qdr7z\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: 
\"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.907340 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-apiservice-cert\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.915631 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-apiservice-cert\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.929897 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-webhook-cert\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:17 crc kubenswrapper[4912]: I0318 13:20:17.932093 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdr7z\" (UniqueName: \"kubernetes.io/projected/9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4-kube-api-access-qdr7z\") pod \"metallb-operator-controller-manager-666765756d-v7mtx\" (UID: \"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4\") " pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.043803 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.145942 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5"] Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.147052 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.149896 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.150213 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9vjqp" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.150333 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.174017 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5"] Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.212135 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx2mm\" (UniqueName: \"kubernetes.io/projected/9ead324e-7891-4059-9d70-90462b2cc852-kube-api-access-hx2mm\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.212430 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ead324e-7891-4059-9d70-90462b2cc852-webhook-cert\") pod 
\"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.212716 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ead324e-7891-4059-9d70-90462b2cc852-apiservice-cert\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.322878 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2mm\" (UniqueName: \"kubernetes.io/projected/9ead324e-7891-4059-9d70-90462b2cc852-kube-api-access-hx2mm\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.323550 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ead324e-7891-4059-9d70-90462b2cc852-webhook-cert\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.323780 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ead324e-7891-4059-9d70-90462b2cc852-apiservice-cert\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.341193 
4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ead324e-7891-4059-9d70-90462b2cc852-webhook-cert\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.363120 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ead324e-7891-4059-9d70-90462b2cc852-apiservice-cert\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.370628 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2mm\" (UniqueName: \"kubernetes.io/projected/9ead324e-7891-4059-9d70-90462b2cc852-kube-api-access-hx2mm\") pod \"metallb-operator-webhook-server-54bbf46695-l6jq5\" (UID: \"9ead324e-7891-4059-9d70-90462b2cc852\") " pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.502498 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:18 crc kubenswrapper[4912]: I0318 13:20:18.687503 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-666765756d-v7mtx"] Mar 18 13:20:19 crc kubenswrapper[4912]: I0318 13:20:19.014921 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5"] Mar 18 13:20:19 crc kubenswrapper[4912]: I0318 13:20:19.212772 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" event={"ID":"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4","Type":"ContainerStarted","Data":"6603adcb7b697db95c63d81b425806a688afa6b283f7288a533508dd6033c476"} Mar 18 13:20:19 crc kubenswrapper[4912]: I0318 13:20:19.215285 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" event={"ID":"9ead324e-7891-4059-9d70-90462b2cc852","Type":"ContainerStarted","Data":"ad695a755f7369bc6f6bc8fd510cb986120a1cf1f1d5526b71ecf128bb0dfefe"} Mar 18 13:20:23 crc kubenswrapper[4912]: I0318 13:20:23.275632 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" event={"ID":"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4","Type":"ContainerStarted","Data":"563b7fc7c9639fb508ff660ec01f7ba421e215a122cde8fd4816b239b7c06b34"} Mar 18 13:20:23 crc kubenswrapper[4912]: I0318 13:20:23.276507 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:23 crc kubenswrapper[4912]: I0318 13:20:23.312853 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" podStartSLOduration=2.74138258 podStartE2EDuration="6.312818622s" 
podCreationTimestamp="2026-03-18 13:20:17 +0000 UTC" firstStartedPulling="2026-03-18 13:20:18.697005474 +0000 UTC m=+1067.156432899" lastFinishedPulling="2026-03-18 13:20:22.268441516 +0000 UTC m=+1070.727868941" observedRunningTime="2026-03-18 13:20:23.305847092 +0000 UTC m=+1071.765274537" watchObservedRunningTime="2026-03-18 13:20:23.312818622 +0000 UTC m=+1071.772246047" Mar 18 13:20:25 crc kubenswrapper[4912]: I0318 13:20:25.292400 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" event={"ID":"9ead324e-7891-4059-9d70-90462b2cc852","Type":"ContainerStarted","Data":"1e7e91974f3cc6458f781703e9c5006f59a4acd8cce645e7358e0c06d4c2897d"} Mar 18 13:20:25 crc kubenswrapper[4912]: I0318 13:20:25.293296 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:25 crc kubenswrapper[4912]: I0318 13:20:25.320561 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podStartSLOduration=1.333361576 podStartE2EDuration="7.320534929s" podCreationTimestamp="2026-03-18 13:20:18 +0000 UTC" firstStartedPulling="2026-03-18 13:20:19.045861829 +0000 UTC m=+1067.505289254" lastFinishedPulling="2026-03-18 13:20:25.033035182 +0000 UTC m=+1073.492462607" observedRunningTime="2026-03-18 13:20:25.314129945 +0000 UTC m=+1073.773557380" watchObservedRunningTime="2026-03-18 13:20:25.320534929 +0000 UTC m=+1073.779962374" Mar 18 13:20:33 crc kubenswrapper[4912]: I0318 13:20:33.955659 4912 scope.go:117] "RemoveContainer" containerID="672fdc1df7dbc5fc0bdab535714f3ef79f01dbb702b11cd691edee75dcabc545" Mar 18 13:20:36 crc kubenswrapper[4912]: I0318 13:20:36.999203 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:20:37 crc kubenswrapper[4912]: I0318 13:20:36.999943 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:20:38 crc kubenswrapper[4912]: I0318 13:20:38.529095 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.046462 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.915321 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd"] Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.916766 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.919920 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.920027 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-929bs" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.928246 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ngzqk"] Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.931497 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.944498 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.944757 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 13:20:58 crc kubenswrapper[4912]: I0318 13:20:58.954869 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd"] Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.027629 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zw7wz"] Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.029126 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.032144 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.032182 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.032262 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nf4dc" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.032739 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042203 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0475f7b9-387c-422d-88c8-90416895b720-frr-startup\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: 
I0318 13:20:59.042263 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0475f7b9-387c-422d-88c8-90416895b720-metrics-certs\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042304 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwjdx\" (UniqueName: \"kubernetes.io/projected/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-kube-api-access-zwjdx\") pod \"frr-k8s-webhook-server-bcc4b6f68-grpwd\" (UID: \"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042404 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxk2\" (UniqueName: \"kubernetes.io/projected/0475f7b9-387c-422d-88c8-90416895b720-kube-api-access-wcxk2\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042434 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-reloader\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042455 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-metrics\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042485 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-frr-conf\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042512 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-frr-sockets\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.042667 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-grpwd\" (UID: \"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.048426 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-tpm8v"] Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.049891 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.057759 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.065664 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-tpm8v"] Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.144819 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcxk2\" (UniqueName: \"kubernetes.io/projected/0475f7b9-387c-422d-88c8-90416895b720-kube-api-access-wcxk2\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.144891 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-reloader\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.144929 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-metrics\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.144961 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6af6424-58bd-4c40-a86c-15627b762a9a-metallb-excludel2\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.144990 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c9a2194-27ba-4a86-b5c1-e8356c71227f-metrics-certs\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145016 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-frr-conf\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145060 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-frr-sockets\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145092 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c9a2194-27ba-4a86-b5c1-e8356c71227f-cert\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145118 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnvlg\" (UniqueName: \"kubernetes.io/projected/1c9a2194-27ba-4a86-b5c1-e8356c71227f-kube-api-access-bnvlg\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145142 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-grpwd\" (UID: \"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145166 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.145308 4912 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.145379 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-cert podName:7d7516e2-d2c4-4f18-9cc6-d2aad94db27e nodeName:}" failed. No retries permitted until 2026-03-18 13:20:59.645354111 +0000 UTC m=+1108.104781536 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-cert") pod "frr-k8s-webhook-server-bcc4b6f68-grpwd" (UID: "7d7516e2-d2c4-4f18-9cc6-d2aad94db27e") : secret "frr-k8s-webhook-server-cert" not found Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145401 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-metrics-certs\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145451 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgrh\" (UniqueName: \"kubernetes.io/projected/c6af6424-58bd-4c40-a86c-15627b762a9a-kube-api-access-xlgrh\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145480 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0475f7b9-387c-422d-88c8-90416895b720-frr-startup\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145511 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0475f7b9-387c-422d-88c8-90416895b720-metrics-certs\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145554 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwjdx\" (UniqueName: 
\"kubernetes.io/projected/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-kube-api-access-zwjdx\") pod \"frr-k8s-webhook-server-bcc4b6f68-grpwd\" (UID: \"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145741 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-metrics\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145753 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-reloader\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145800 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-frr-conf\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.145891 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0475f7b9-387c-422d-88c8-90416895b720-frr-sockets\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.146858 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0475f7b9-387c-422d-88c8-90416895b720-frr-startup\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc 
kubenswrapper[4912]: I0318 13:20:59.154560 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0475f7b9-387c-422d-88c8-90416895b720-metrics-certs\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.167725 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwjdx\" (UniqueName: \"kubernetes.io/projected/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-kube-api-access-zwjdx\") pod \"frr-k8s-webhook-server-bcc4b6f68-grpwd\" (UID: \"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.170858 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcxk2\" (UniqueName: \"kubernetes.io/projected/0475f7b9-387c-422d-88c8-90416895b720-kube-api-access-wcxk2\") pod \"frr-k8s-ngzqk\" (UID: \"0475f7b9-387c-422d-88c8-90416895b720\") " pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.247815 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c9a2194-27ba-4a86-b5c1-e8356c71227f-cert\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.247867 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnvlg\" (UniqueName: \"kubernetes.io/projected/1c9a2194-27ba-4a86-b5c1-e8356c71227f-kube-api-access-bnvlg\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.247892 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.247937 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-metrics-certs\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.247981 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgrh\" (UniqueName: \"kubernetes.io/projected/c6af6424-58bd-4c40-a86c-15627b762a9a-kube-api-access-xlgrh\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.248084 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6af6424-58bd-4c40-a86c-15627b762a9a-metallb-excludel2\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.248108 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c9a2194-27ba-4a86-b5c1-e8356c71227f-metrics-certs\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.248158 4912 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.248207 4912 
secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.248252 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist podName:c6af6424-58bd-4c40-a86c-15627b762a9a nodeName:}" failed. No retries permitted until 2026-03-18 13:20:59.748227485 +0000 UTC m=+1108.207654910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist") pod "speaker-zw7wz" (UID: "c6af6424-58bd-4c40-a86c-15627b762a9a") : secret "metallb-memberlist" not found Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.248322 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-metrics-certs podName:c6af6424-58bd-4c40-a86c-15627b762a9a nodeName:}" failed. No retries permitted until 2026-03-18 13:20:59.748289667 +0000 UTC m=+1108.207717282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-metrics-certs") pod "speaker-zw7wz" (UID: "c6af6424-58bd-4c40-a86c-15627b762a9a") : secret "speaker-certs-secret" not found Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.249374 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6af6424-58bd-4c40-a86c-15627b762a9a-metallb-excludel2\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.256703 4912 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.257575 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c9a2194-27ba-4a86-b5c1-e8356c71227f-metrics-certs\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.262227 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c9a2194-27ba-4a86-b5c1-e8356c71227f-cert\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.266442 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.278177 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnvlg\" (UniqueName: \"kubernetes.io/projected/1c9a2194-27ba-4a86-b5c1-e8356c71227f-kube-api-access-bnvlg\") pod \"controller-7bb4cc7c98-tpm8v\" (UID: \"1c9a2194-27ba-4a86-b5c1-e8356c71227f\") " pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.281843 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgrh\" (UniqueName: \"kubernetes.io/projected/c6af6424-58bd-4c40-a86c-15627b762a9a-kube-api-access-xlgrh\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.374765 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.468394 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.580729 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"f14e811903de89bda9530a68abca36eb3e6f6ddcd48b8e1890ce9919d752c0ee"} Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.657572 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-grpwd\" (UID: \"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.664913 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d7516e2-d2c4-4f18-9cc6-d2aad94db27e-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-grpwd\" (UID: \"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.760466 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.760535 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-metrics-certs\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.760739 4912 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 13:20:59 crc kubenswrapper[4912]: E0318 13:20:59.760863 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist podName:c6af6424-58bd-4c40-a86c-15627b762a9a nodeName:}" failed. No retries permitted until 2026-03-18 13:21:00.760835096 +0000 UTC m=+1109.220262521 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist") pod "speaker-zw7wz" (UID: "c6af6424-58bd-4c40-a86c-15627b762a9a") : secret "metallb-memberlist" not found Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.765913 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-metrics-certs\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.843963 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:20:59 crc kubenswrapper[4912]: I0318 13:20:59.902151 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-tpm8v"] Mar 18 13:20:59 crc kubenswrapper[4912]: W0318 13:20:59.906220 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c9a2194_27ba_4a86_b5c1_e8356c71227f.slice/crio-05f72077f99e331c4e3438cde3cf696446f5eb3966c99e18603d5d08164fc521 WatchSource:0}: Error finding container 05f72077f99e331c4e3438cde3cf696446f5eb3966c99e18603d5d08164fc521: Status 404 returned error can't find the container with id 05f72077f99e331c4e3438cde3cf696446f5eb3966c99e18603d5d08164fc521 Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.249767 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd"] Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.592759 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tpm8v" 
event={"ID":"1c9a2194-27ba-4a86-b5c1-e8356c71227f","Type":"ContainerStarted","Data":"6cadb388f9eccac0fa98e76cb257abb65d35518ce3ed54292ec95ecf12750c7c"} Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.593240 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tpm8v" event={"ID":"1c9a2194-27ba-4a86-b5c1-e8356c71227f","Type":"ContainerStarted","Data":"730b81e54e047c5b4e37870489cf5c8c6c9c26326dfd4bfeb001018f1280b1d8"} Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.593264 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.593305 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tpm8v" event={"ID":"1c9a2194-27ba-4a86-b5c1-e8356c71227f","Type":"ContainerStarted","Data":"05f72077f99e331c4e3438cde3cf696446f5eb3966c99e18603d5d08164fc521"} Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.594197 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" event={"ID":"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e","Type":"ContainerStarted","Data":"534d70e5208f205bdb010777f6514d8ca74e7548683145e35a7d0d06cb731bab"} Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.785543 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.795252 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6af6424-58bd-4c40-a86c-15627b762a9a-memberlist\") pod \"speaker-zw7wz\" (UID: \"c6af6424-58bd-4c40-a86c-15627b762a9a\") " pod="metallb-system/speaker-zw7wz" 
Mar 18 13:21:00 crc kubenswrapper[4912]: I0318 13:21:00.852970 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zw7wz" Mar 18 13:21:00 crc kubenswrapper[4912]: W0318 13:21:00.903539 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6af6424_58bd_4c40_a86c_15627b762a9a.slice/crio-47c8f9743a6dea97b2d7088b8be7690ed16374da177bd031b0fecd1d9782e904 WatchSource:0}: Error finding container 47c8f9743a6dea97b2d7088b8be7690ed16374da177bd031b0fecd1d9782e904: Status 404 returned error can't find the container with id 47c8f9743a6dea97b2d7088b8be7690ed16374da177bd031b0fecd1d9782e904 Mar 18 13:21:01 crc kubenswrapper[4912]: I0318 13:21:01.611048 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zw7wz" event={"ID":"c6af6424-58bd-4c40-a86c-15627b762a9a","Type":"ContainerStarted","Data":"7b3eb3d2157ccae367ade267ca86de385fe736c56c07d94cc93d635ba40a0754"} Mar 18 13:21:01 crc kubenswrapper[4912]: I0318 13:21:01.611407 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zw7wz" event={"ID":"c6af6424-58bd-4c40-a86c-15627b762a9a","Type":"ContainerStarted","Data":"1865d794c53fc9c0b08efce4bbd193e08db3f1df48dc3e4396739edf9e48cbf4"} Mar 18 13:21:01 crc kubenswrapper[4912]: I0318 13:21:01.611419 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zw7wz" event={"ID":"c6af6424-58bd-4c40-a86c-15627b762a9a","Type":"ContainerStarted","Data":"47c8f9743a6dea97b2d7088b8be7690ed16374da177bd031b0fecd1d9782e904"} Mar 18 13:21:01 crc kubenswrapper[4912]: I0318 13:21:01.611632 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zw7wz" Mar 18 13:21:01 crc kubenswrapper[4912]: I0318 13:21:01.636355 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-tpm8v" 
podStartSLOduration=2.636329537 podStartE2EDuration="2.636329537s" podCreationTimestamp="2026-03-18 13:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:21:00.611087926 +0000 UTC m=+1109.070515381" watchObservedRunningTime="2026-03-18 13:21:01.636329537 +0000 UTC m=+1110.095756962" Mar 18 13:21:01 crc kubenswrapper[4912]: I0318 13:21:01.638191 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zw7wz" podStartSLOduration=2.638182198 podStartE2EDuration="2.638182198s" podCreationTimestamp="2026-03-18 13:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:21:01.634646232 +0000 UTC m=+1110.094073677" watchObservedRunningTime="2026-03-18 13:21:01.638182198 +0000 UTC m=+1110.097609623" Mar 18 13:21:06 crc kubenswrapper[4912]: I0318 13:21:06.999134 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:21:07 crc kubenswrapper[4912]: I0318 13:21:06.999885 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:21:09 crc kubenswrapper[4912]: I0318 13:21:09.749483 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" 
event={"ID":"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e","Type":"ContainerStarted","Data":"a3882757bbf19032efcdfc98738b4713ab8eb41ae9cfa9af6fc0a0163e543a3b"} Mar 18 13:21:09 crc kubenswrapper[4912]: I0318 13:21:09.750448 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:21:09 crc kubenswrapper[4912]: I0318 13:21:09.758008 4912 generic.go:334] "Generic (PLEG): container finished" podID="0475f7b9-387c-422d-88c8-90416895b720" containerID="541f5f6c6f53516207d7f8a156092560bf0996940fd1a42396bf6094535c7870" exitCode=0 Mar 18 13:21:09 crc kubenswrapper[4912]: I0318 13:21:09.758106 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerDied","Data":"541f5f6c6f53516207d7f8a156092560bf0996940fd1a42396bf6094535c7870"} Mar 18 13:21:09 crc kubenswrapper[4912]: I0318 13:21:09.787920 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podStartSLOduration=3.166025806 podStartE2EDuration="11.787887597s" podCreationTimestamp="2026-03-18 13:20:58 +0000 UTC" firstStartedPulling="2026-03-18 13:21:00.275946204 +0000 UTC m=+1108.735373629" lastFinishedPulling="2026-03-18 13:21:08.897807995 +0000 UTC m=+1117.357235420" observedRunningTime="2026-03-18 13:21:09.780007983 +0000 UTC m=+1118.239435408" watchObservedRunningTime="2026-03-18 13:21:09.787887597 +0000 UTC m=+1118.247315022" Mar 18 13:21:10 crc kubenswrapper[4912]: I0318 13:21:10.770360 4912 generic.go:334] "Generic (PLEG): container finished" podID="0475f7b9-387c-422d-88c8-90416895b720" containerID="576442177ba0be36fda19df16acdaa11d7d7ce1e75ea3eb13ac15468fceea37d" exitCode=0 Mar 18 13:21:10 crc kubenswrapper[4912]: I0318 13:21:10.770467 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" 
event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerDied","Data":"576442177ba0be36fda19df16acdaa11d7d7ce1e75ea3eb13ac15468fceea37d"} Mar 18 13:21:11 crc kubenswrapper[4912]: I0318 13:21:11.786374 4912 generic.go:334] "Generic (PLEG): container finished" podID="0475f7b9-387c-422d-88c8-90416895b720" containerID="732fe0efd9e37a4bb07940e644218a47c61456e26cbc3341aca011c19572bdcf" exitCode=0 Mar 18 13:21:11 crc kubenswrapper[4912]: I0318 13:21:11.786467 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerDied","Data":"732fe0efd9e37a4bb07940e644218a47c61456e26cbc3341aca011c19572bdcf"} Mar 18 13:21:12 crc kubenswrapper[4912]: I0318 13:21:12.801986 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"2acdd1e9269f4625660ce9198fe462b6eed2646e50fdd718f9d96bb95f254fc6"} Mar 18 13:21:12 crc kubenswrapper[4912]: I0318 13:21:12.803173 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"7d4e1f2d38a6a7d5410c0e7449f530ffae662f5fe9da4ffae14b336e5e691909"} Mar 18 13:21:12 crc kubenswrapper[4912]: I0318 13:21:12.803191 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"eb36b16a7c0e243fe904694742ec7e11a72b33aea2b15888a04c8540a42db45c"} Mar 18 13:21:12 crc kubenswrapper[4912]: I0318 13:21:12.803201 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"6ddb7fb8b94f5302e47ce0b36cc2f0df9ccb93c03581f10ead06f49e4d8a813b"} Mar 18 13:21:13 crc kubenswrapper[4912]: I0318 13:21:13.815322 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"3cc31d988a8fd52c8e8156349fd129c35623d585873c4d994fc78c4f7379f59d"} Mar 18 13:21:13 crc kubenswrapper[4912]: I0318 13:21:13.815988 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"46a04f59320a66aedbfd8538fdf66b2ddd54d69b2a20ebf9a8e688ad6ff9c3e7"} Mar 18 13:21:13 crc kubenswrapper[4912]: I0318 13:21:13.816016 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:21:13 crc kubenswrapper[4912]: I0318 13:21:13.854829 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ngzqk" podStartSLOduration=6.453077332 podStartE2EDuration="15.854805361s" podCreationTimestamp="2026-03-18 13:20:58 +0000 UTC" firstStartedPulling="2026-03-18 13:20:59.467994483 +0000 UTC m=+1107.927421908" lastFinishedPulling="2026-03-18 13:21:08.869722512 +0000 UTC m=+1117.329149937" observedRunningTime="2026-03-18 13:21:13.84778026 +0000 UTC m=+1122.307207695" watchObservedRunningTime="2026-03-18 13:21:13.854805361 +0000 UTC m=+1122.314232786" Mar 18 13:21:14 crc kubenswrapper[4912]: I0318 13:21:14.266705 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:21:14 crc kubenswrapper[4912]: I0318 13:21:14.337917 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:21:19 crc kubenswrapper[4912]: I0318 13:21:19.384882 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-tpm8v" Mar 18 13:21:19 crc kubenswrapper[4912]: I0318 13:21:19.850068 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 13:21:20 crc kubenswrapper[4912]: I0318 13:21:20.857365 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zw7wz" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.059565 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hnx8t"] Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.061273 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hnx8t" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.067544 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2hvzw" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.068595 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.069017 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.085390 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hnx8t"] Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.228845 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k6kq\" (UniqueName: \"kubernetes.io/projected/6996f4d5-4808-4189-8158-6c231efd0df3-kube-api-access-8k6kq\") pod \"openstack-operator-index-hnx8t\" (UID: \"6996f4d5-4808-4189-8158-6c231efd0df3\") " pod="openstack-operators/openstack-operator-index-hnx8t" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.330837 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k6kq\" (UniqueName: 
\"kubernetes.io/projected/6996f4d5-4808-4189-8158-6c231efd0df3-kube-api-access-8k6kq\") pod \"openstack-operator-index-hnx8t\" (UID: \"6996f4d5-4808-4189-8158-6c231efd0df3\") " pod="openstack-operators/openstack-operator-index-hnx8t" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.367332 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k6kq\" (UniqueName: \"kubernetes.io/projected/6996f4d5-4808-4189-8158-6c231efd0df3-kube-api-access-8k6kq\") pod \"openstack-operator-index-hnx8t\" (UID: \"6996f4d5-4808-4189-8158-6c231efd0df3\") " pod="openstack-operators/openstack-operator-index-hnx8t" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.384166 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hnx8t" Mar 18 13:21:24 crc kubenswrapper[4912]: I0318 13:21:24.911258 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hnx8t"] Mar 18 13:21:24 crc kubenswrapper[4912]: W0318 13:21:24.917645 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6996f4d5_4808_4189_8158_6c231efd0df3.slice/crio-41ba55b2df76ea4cd3bea6d07fb63593db5f1ad6823a865d4c94fbcbfdecdeb3 WatchSource:0}: Error finding container 41ba55b2df76ea4cd3bea6d07fb63593db5f1ad6823a865d4c94fbcbfdecdeb3: Status 404 returned error can't find the container with id 41ba55b2df76ea4cd3bea6d07fb63593db5f1ad6823a865d4c94fbcbfdecdeb3 Mar 18 13:21:25 crc kubenswrapper[4912]: I0318 13:21:25.941558 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hnx8t" event={"ID":"6996f4d5-4808-4189-8158-6c231efd0df3","Type":"ContainerStarted","Data":"41ba55b2df76ea4cd3bea6d07fb63593db5f1ad6823a865d4c94fbcbfdecdeb3"} Mar 18 13:21:27 crc kubenswrapper[4912]: I0318 13:21:27.412991 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-hnx8t"] Mar 18 13:21:28 crc kubenswrapper[4912]: I0318 13:21:28.020248 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tkt7x"] Mar 18 13:21:28 crc kubenswrapper[4912]: I0318 13:21:28.022088 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:28 crc kubenswrapper[4912]: I0318 13:21:28.037293 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tkt7x"] Mar 18 13:21:28 crc kubenswrapper[4912]: I0318 13:21:28.120100 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnp6l\" (UniqueName: \"kubernetes.io/projected/10c9b954-d1cb-4055-a082-5b06828b5faa-kube-api-access-qnp6l\") pod \"openstack-operator-index-tkt7x\" (UID: \"10c9b954-d1cb-4055-a082-5b06828b5faa\") " pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:28 crc kubenswrapper[4912]: I0318 13:21:28.222603 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnp6l\" (UniqueName: \"kubernetes.io/projected/10c9b954-d1cb-4055-a082-5b06828b5faa-kube-api-access-qnp6l\") pod \"openstack-operator-index-tkt7x\" (UID: \"10c9b954-d1cb-4055-a082-5b06828b5faa\") " pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:28 crc kubenswrapper[4912]: I0318 13:21:28.270848 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnp6l\" (UniqueName: \"kubernetes.io/projected/10c9b954-d1cb-4055-a082-5b06828b5faa-kube-api-access-qnp6l\") pod \"openstack-operator-index-tkt7x\" (UID: \"10c9b954-d1cb-4055-a082-5b06828b5faa\") " pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:28 crc kubenswrapper[4912]: I0318 13:21:28.349462 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:29 crc kubenswrapper[4912]: I0318 13:21:29.269797 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ngzqk" Mar 18 13:21:29 crc kubenswrapper[4912]: I0318 13:21:29.476655 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tkt7x"] Mar 18 13:21:29 crc kubenswrapper[4912]: I0318 13:21:29.980828 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tkt7x" event={"ID":"10c9b954-d1cb-4055-a082-5b06828b5faa","Type":"ContainerStarted","Data":"b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00"} Mar 18 13:21:29 crc kubenswrapper[4912]: I0318 13:21:29.981254 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tkt7x" event={"ID":"10c9b954-d1cb-4055-a082-5b06828b5faa","Type":"ContainerStarted","Data":"2210c9bec5f32265f7ce037b482ca4b0ac22fa049ec7f6536bd864b7163ab134"} Mar 18 13:21:29 crc kubenswrapper[4912]: I0318 13:21:29.982578 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hnx8t" event={"ID":"6996f4d5-4808-4189-8158-6c231efd0df3","Type":"ContainerStarted","Data":"623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4"} Mar 18 13:21:29 crc kubenswrapper[4912]: I0318 13:21:29.982709 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hnx8t" podUID="6996f4d5-4808-4189-8158-6c231efd0df3" containerName="registry-server" containerID="cri-o://623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4" gracePeriod=2 Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.033357 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hnx8t" 
podStartSLOduration=1.8729138330000001 podStartE2EDuration="6.033322566s" podCreationTimestamp="2026-03-18 13:21:24 +0000 UTC" firstStartedPulling="2026-03-18 13:21:24.922358689 +0000 UTC m=+1133.381786104" lastFinishedPulling="2026-03-18 13:21:29.082767412 +0000 UTC m=+1137.542194837" observedRunningTime="2026-03-18 13:21:30.02794581 +0000 UTC m=+1138.487373235" watchObservedRunningTime="2026-03-18 13:21:30.033322566 +0000 UTC m=+1138.492750001" Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.039447 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tkt7x" podStartSLOduration=1.9896886999999999 podStartE2EDuration="2.039431001s" podCreationTimestamp="2026-03-18 13:21:28 +0000 UTC" firstStartedPulling="2026-03-18 13:21:29.483449734 +0000 UTC m=+1137.942877159" lastFinishedPulling="2026-03-18 13:21:29.533192005 +0000 UTC m=+1137.992619460" observedRunningTime="2026-03-18 13:21:30.014285859 +0000 UTC m=+1138.473713284" watchObservedRunningTime="2026-03-18 13:21:30.039431001 +0000 UTC m=+1138.498858436" Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.575624 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hnx8t" Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.678763 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k6kq\" (UniqueName: \"kubernetes.io/projected/6996f4d5-4808-4189-8158-6c231efd0df3-kube-api-access-8k6kq\") pod \"6996f4d5-4808-4189-8158-6c231efd0df3\" (UID: \"6996f4d5-4808-4189-8158-6c231efd0df3\") " Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.687729 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6996f4d5-4808-4189-8158-6c231efd0df3-kube-api-access-8k6kq" (OuterVolumeSpecName: "kube-api-access-8k6kq") pod "6996f4d5-4808-4189-8158-6c231efd0df3" (UID: "6996f4d5-4808-4189-8158-6c231efd0df3"). InnerVolumeSpecName "kube-api-access-8k6kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.781304 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k6kq\" (UniqueName: \"kubernetes.io/projected/6996f4d5-4808-4189-8158-6c231efd0df3-kube-api-access-8k6kq\") on node \"crc\" DevicePath \"\"" Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.994068 4912 generic.go:334] "Generic (PLEG): container finished" podID="6996f4d5-4808-4189-8158-6c231efd0df3" containerID="623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4" exitCode=0 Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.994126 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hnx8t" Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.994170 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hnx8t" event={"ID":"6996f4d5-4808-4189-8158-6c231efd0df3","Type":"ContainerDied","Data":"623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4"} Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.994199 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hnx8t" event={"ID":"6996f4d5-4808-4189-8158-6c231efd0df3","Type":"ContainerDied","Data":"41ba55b2df76ea4cd3bea6d07fb63593db5f1ad6823a865d4c94fbcbfdecdeb3"} Mar 18 13:21:30 crc kubenswrapper[4912]: I0318 13:21:30.994216 4912 scope.go:117] "RemoveContainer" containerID="623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4" Mar 18 13:21:31 crc kubenswrapper[4912]: I0318 13:21:31.026431 4912 scope.go:117] "RemoveContainer" containerID="623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4" Mar 18 13:21:31 crc kubenswrapper[4912]: E0318 13:21:31.027223 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4\": container with ID starting with 623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4 not found: ID does not exist" containerID="623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4" Mar 18 13:21:31 crc kubenswrapper[4912]: I0318 13:21:31.027279 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4"} err="failed to get container status \"623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4\": rpc error: code = NotFound desc = could not find container 
\"623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4\": container with ID starting with 623c88b6eb1a80d5a32c9b350bf4672999aff02c9e5db54fe5b185dbc2ab23a4 not found: ID does not exist" Mar 18 13:21:31 crc kubenswrapper[4912]: I0318 13:21:31.033204 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hnx8t"] Mar 18 13:21:31 crc kubenswrapper[4912]: I0318 13:21:31.039708 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hnx8t"] Mar 18 13:21:32 crc kubenswrapper[4912]: I0318 13:21:32.238293 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6996f4d5-4808-4189-8158-6c231efd0df3" path="/var/lib/kubelet/pods/6996f4d5-4808-4189-8158-6c231efd0df3/volumes" Mar 18 13:21:36 crc kubenswrapper[4912]: I0318 13:21:36.999226 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:21:37 crc kubenswrapper[4912]: I0318 13:21:36.999939 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:21:37 crc kubenswrapper[4912]: I0318 13:21:37.000010 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:21:37 crc kubenswrapper[4912]: I0318 13:21:37.001014 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d8ccf0d59f4df315e6c70763e698a5db3ca57ea836d4bda55d4c5996a0aad5df"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:21:37 crc kubenswrapper[4912]: I0318 13:21:37.001111 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://d8ccf0d59f4df315e6c70763e698a5db3ca57ea836d4bda55d4c5996a0aad5df" gracePeriod=600 Mar 18 13:21:38 crc kubenswrapper[4912]: I0318 13:21:38.061082 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="d8ccf0d59f4df315e6c70763e698a5db3ca57ea836d4bda55d4c5996a0aad5df" exitCode=0 Mar 18 13:21:38 crc kubenswrapper[4912]: I0318 13:21:38.061174 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"d8ccf0d59f4df315e6c70763e698a5db3ca57ea836d4bda55d4c5996a0aad5df"} Mar 18 13:21:38 crc kubenswrapper[4912]: I0318 13:21:38.061473 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"905ac753c282bcb8ff90a4cd97a7fced94d0b0133ec58021181ae4cebb3a39a9"} Mar 18 13:21:38 crc kubenswrapper[4912]: I0318 13:21:38.061495 4912 scope.go:117] "RemoveContainer" containerID="05cdb0519bbf08b4e978264b2bdfdf4662c568a31d460058df90a82a4a831459" Mar 18 13:21:38 crc kubenswrapper[4912]: I0318 13:21:38.350313 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:38 crc 
kubenswrapper[4912]: I0318 13:21:38.350703 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:38 crc kubenswrapper[4912]: I0318 13:21:38.384270 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:39 crc kubenswrapper[4912]: I0318 13:21:39.108071 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.515067 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt"] Mar 18 13:21:46 crc kubenswrapper[4912]: E0318 13:21:46.516190 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6996f4d5-4808-4189-8158-6c231efd0df3" containerName="registry-server" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.516210 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6996f4d5-4808-4189-8158-6c231efd0df3" containerName="registry-server" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.516414 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6996f4d5-4808-4189-8158-6c231efd0df3" containerName="registry-server" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.517857 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.527938 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tfkl2" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.529752 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt"] Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.611722 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-util\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.612147 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-bundle\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.612249 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969jg\" (UniqueName: \"kubernetes.io/projected/3982fa5f-f22b-4e44-8f14-3edfda813bb1-kube-api-access-969jg\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 
13:21:46.714223 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-util\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.714315 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-bundle\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.714359 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-969jg\" (UniqueName: \"kubernetes.io/projected/3982fa5f-f22b-4e44-8f14-3edfda813bb1-kube-api-access-969jg\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.715359 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-util\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.715544 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-bundle\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.755813 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-969jg\" (UniqueName: \"kubernetes.io/projected/3982fa5f-f22b-4e44-8f14-3edfda813bb1-kube-api-access-969jg\") pod \"936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:46 crc kubenswrapper[4912]: I0318 13:21:46.835833 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:47 crc kubenswrapper[4912]: I0318 13:21:47.300405 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt"] Mar 18 13:21:48 crc kubenswrapper[4912]: I0318 13:21:48.162755 4912 generic.go:334] "Generic (PLEG): container finished" podID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerID="4276ddb56da96c5a16d60fec27be487ecc8ccfef23967894cff0fec53cb79838" exitCode=0 Mar 18 13:21:48 crc kubenswrapper[4912]: I0318 13:21:48.162832 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" event={"ID":"3982fa5f-f22b-4e44-8f14-3edfda813bb1","Type":"ContainerDied","Data":"4276ddb56da96c5a16d60fec27be487ecc8ccfef23967894cff0fec53cb79838"} Mar 18 13:21:48 crc kubenswrapper[4912]: I0318 13:21:48.163117 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" event={"ID":"3982fa5f-f22b-4e44-8f14-3edfda813bb1","Type":"ContainerStarted","Data":"0ac92a4ae492c251f0b5e75506e58f8a89c08ca4ce6161042b21cc76ccc762af"} Mar 18 13:21:49 crc kubenswrapper[4912]: I0318 13:21:49.173488 4912 generic.go:334] "Generic (PLEG): container finished" podID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerID="b4b43bf02eaf05c8c5d980525c1405b2b64ed821159a1419d917c11e365bab0f" exitCode=0 Mar 18 13:21:49 crc kubenswrapper[4912]: I0318 13:21:49.173696 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" event={"ID":"3982fa5f-f22b-4e44-8f14-3edfda813bb1","Type":"ContainerDied","Data":"b4b43bf02eaf05c8c5d980525c1405b2b64ed821159a1419d917c11e365bab0f"} Mar 18 13:21:50 crc kubenswrapper[4912]: I0318 13:21:50.184430 4912 generic.go:334] "Generic (PLEG): container finished" podID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerID="2279d1d5b8c846adbc5653654b6e9fdd7362f82a9a083d34ecd57374ddd07cb2" exitCode=0 Mar 18 13:21:50 crc kubenswrapper[4912]: I0318 13:21:50.184511 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" event={"ID":"3982fa5f-f22b-4e44-8f14-3edfda813bb1","Type":"ContainerDied","Data":"2279d1d5b8c846adbc5653654b6e9fdd7362f82a9a083d34ecd57374ddd07cb2"} Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.579543 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.720809 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-bundle\") pod \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.721343 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-969jg\" (UniqueName: \"kubernetes.io/projected/3982fa5f-f22b-4e44-8f14-3edfda813bb1-kube-api-access-969jg\") pod \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.721493 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-util\") pod \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\" (UID: \"3982fa5f-f22b-4e44-8f14-3edfda813bb1\") " Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.721851 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-bundle" (OuterVolumeSpecName: "bundle") pod "3982fa5f-f22b-4e44-8f14-3edfda813bb1" (UID: "3982fa5f-f22b-4e44-8f14-3edfda813bb1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.722662 4912 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.730678 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3982fa5f-f22b-4e44-8f14-3edfda813bb1-kube-api-access-969jg" (OuterVolumeSpecName: "kube-api-access-969jg") pod "3982fa5f-f22b-4e44-8f14-3edfda813bb1" (UID: "3982fa5f-f22b-4e44-8f14-3edfda813bb1"). InnerVolumeSpecName "kube-api-access-969jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.738810 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-util" (OuterVolumeSpecName: "util") pod "3982fa5f-f22b-4e44-8f14-3edfda813bb1" (UID: "3982fa5f-f22b-4e44-8f14-3edfda813bb1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.824740 4912 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3982fa5f-f22b-4e44-8f14-3edfda813bb1-util\") on node \"crc\" DevicePath \"\"" Mar 18 13:21:51 crc kubenswrapper[4912]: I0318 13:21:51.824785 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-969jg\" (UniqueName: \"kubernetes.io/projected/3982fa5f-f22b-4e44-8f14-3edfda813bb1-kube-api-access-969jg\") on node \"crc\" DevicePath \"\"" Mar 18 13:21:52 crc kubenswrapper[4912]: I0318 13:21:52.206558 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" event={"ID":"3982fa5f-f22b-4e44-8f14-3edfda813bb1","Type":"ContainerDied","Data":"0ac92a4ae492c251f0b5e75506e58f8a89c08ca4ce6161042b21cc76ccc762af"} Mar 18 13:21:52 crc kubenswrapper[4912]: I0318 13:21:52.207072 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ac92a4ae492c251f0b5e75506e58f8a89c08ca4ce6161042b21cc76ccc762af" Mar 18 13:21:52 crc kubenswrapper[4912]: I0318 13:21:52.206670 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.579989 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq"] Mar 18 13:21:57 crc kubenswrapper[4912]: E0318 13:21:57.581203 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerName="util" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.581224 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerName="util" Mar 18 13:21:57 crc kubenswrapper[4912]: E0318 13:21:57.581257 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerName="extract" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.581265 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerName="extract" Mar 18 13:21:57 crc kubenswrapper[4912]: E0318 13:21:57.581280 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerName="pull" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.581292 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerName="pull" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.581466 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3982fa5f-f22b-4e44-8f14-3edfda813bb1" containerName="extract" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.582295 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.586111 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-9bqfb" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.608556 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq"] Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.633867 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfppk\" (UniqueName: \"kubernetes.io/projected/ef041eab-e584-4a2a-8008-9a7f07f75f70-kube-api-access-sfppk\") pod \"openstack-operator-controller-init-57c55bf5f4-gflkq\" (UID: \"ef041eab-e584-4a2a-8008-9a7f07f75f70\") " pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.735537 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfppk\" (UniqueName: \"kubernetes.io/projected/ef041eab-e584-4a2a-8008-9a7f07f75f70-kube-api-access-sfppk\") pod \"openstack-operator-controller-init-57c55bf5f4-gflkq\" (UID: \"ef041eab-e584-4a2a-8008-9a7f07f75f70\") " pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.772397 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfppk\" (UniqueName: \"kubernetes.io/projected/ef041eab-e584-4a2a-8008-9a7f07f75f70-kube-api-access-sfppk\") pod \"openstack-operator-controller-init-57c55bf5f4-gflkq\" (UID: \"ef041eab-e584-4a2a-8008-9a7f07f75f70\") " pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 13:21:57 crc kubenswrapper[4912]: I0318 13:21:57.905059 4912 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 13:21:58 crc kubenswrapper[4912]: I0318 13:21:58.517771 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq"] Mar 18 13:21:59 crc kubenswrapper[4912]: I0318 13:21:59.275699 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" event={"ID":"ef041eab-e584-4a2a-8008-9a7f07f75f70","Type":"ContainerStarted","Data":"af013287c39bb30a24eba63b50ba3c3da9fb909edf84fffa8648fcc626fda097"} Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.176227 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564002-mtm5z"] Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.177654 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-mtm5z" Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.182333 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.182574 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.182707 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.198829 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-mtm5z"] Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.289633 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbq4\" (UniqueName: 
\"kubernetes.io/projected/c3b2deea-af2a-420b-a2c1-6b109851ce15-kube-api-access-xdbq4\") pod \"auto-csr-approver-29564002-mtm5z\" (UID: \"c3b2deea-af2a-420b-a2c1-6b109851ce15\") " pod="openshift-infra/auto-csr-approver-29564002-mtm5z" Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.392547 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbq4\" (UniqueName: \"kubernetes.io/projected/c3b2deea-af2a-420b-a2c1-6b109851ce15-kube-api-access-xdbq4\") pod \"auto-csr-approver-29564002-mtm5z\" (UID: \"c3b2deea-af2a-420b-a2c1-6b109851ce15\") " pod="openshift-infra/auto-csr-approver-29564002-mtm5z" Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.418464 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbq4\" (UniqueName: \"kubernetes.io/projected/c3b2deea-af2a-420b-a2c1-6b109851ce15-kube-api-access-xdbq4\") pod \"auto-csr-approver-29564002-mtm5z\" (UID: \"c3b2deea-af2a-420b-a2c1-6b109851ce15\") " pod="openshift-infra/auto-csr-approver-29564002-mtm5z" Mar 18 13:22:00 crc kubenswrapper[4912]: I0318 13:22:00.509742 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-mtm5z" Mar 18 13:22:03 crc kubenswrapper[4912]: I0318 13:22:03.326198 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" event={"ID":"ef041eab-e584-4a2a-8008-9a7f07f75f70","Type":"ContainerStarted","Data":"be844077f5223c55355d7502cf5d76e390848893355e1b0691a97951a3a89fd5"} Mar 18 13:22:03 crc kubenswrapper[4912]: I0318 13:22:03.328437 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 13:22:03 crc kubenswrapper[4912]: I0318 13:22:03.352553 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-mtm5z"] Mar 18 13:22:03 crc kubenswrapper[4912]: I0318 13:22:03.383567 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" podStartSLOduration=1.989445145 podStartE2EDuration="6.383537854s" podCreationTimestamp="2026-03-18 13:21:57 +0000 UTC" firstStartedPulling="2026-03-18 13:21:58.525708432 +0000 UTC m=+1166.985135857" lastFinishedPulling="2026-03-18 13:22:02.919801141 +0000 UTC m=+1171.379228566" observedRunningTime="2026-03-18 13:22:03.377029658 +0000 UTC m=+1171.836457103" watchObservedRunningTime="2026-03-18 13:22:03.383537854 +0000 UTC m=+1171.842965299" Mar 18 13:22:04 crc kubenswrapper[4912]: I0318 13:22:04.337916 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-mtm5z" event={"ID":"c3b2deea-af2a-420b-a2c1-6b109851ce15","Type":"ContainerStarted","Data":"74336ec1a51c579e5cfbe1281ab8831e59801b504323678fe5d5e62d044ba6d6"} Mar 18 13:22:05 crc kubenswrapper[4912]: I0318 13:22:05.350208 4912 generic.go:334] "Generic (PLEG): container finished" podID="c3b2deea-af2a-420b-a2c1-6b109851ce15" 
containerID="f99c3661c41f36b7fce4ed7988b38fa923c7af3480b1e27b1820d0f3b4b5255b" exitCode=0 Mar 18 13:22:05 crc kubenswrapper[4912]: I0318 13:22:05.350289 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-mtm5z" event={"ID":"c3b2deea-af2a-420b-a2c1-6b109851ce15","Type":"ContainerDied","Data":"f99c3661c41f36b7fce4ed7988b38fa923c7af3480b1e27b1820d0f3b4b5255b"} Mar 18 13:22:06 crc kubenswrapper[4912]: I0318 13:22:06.682747 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-mtm5z" Mar 18 13:22:06 crc kubenswrapper[4912]: I0318 13:22:06.728919 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdbq4\" (UniqueName: \"kubernetes.io/projected/c3b2deea-af2a-420b-a2c1-6b109851ce15-kube-api-access-xdbq4\") pod \"c3b2deea-af2a-420b-a2c1-6b109851ce15\" (UID: \"c3b2deea-af2a-420b-a2c1-6b109851ce15\") " Mar 18 13:22:06 crc kubenswrapper[4912]: I0318 13:22:06.737662 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b2deea-af2a-420b-a2c1-6b109851ce15-kube-api-access-xdbq4" (OuterVolumeSpecName: "kube-api-access-xdbq4") pod "c3b2deea-af2a-420b-a2c1-6b109851ce15" (UID: "c3b2deea-af2a-420b-a2c1-6b109851ce15"). InnerVolumeSpecName "kube-api-access-xdbq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:22:06 crc kubenswrapper[4912]: I0318 13:22:06.831447 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdbq4\" (UniqueName: \"kubernetes.io/projected/c3b2deea-af2a-420b-a2c1-6b109851ce15-kube-api-access-xdbq4\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:07 crc kubenswrapper[4912]: I0318 13:22:07.369628 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-mtm5z" event={"ID":"c3b2deea-af2a-420b-a2c1-6b109851ce15","Type":"ContainerDied","Data":"74336ec1a51c579e5cfbe1281ab8831e59801b504323678fe5d5e62d044ba6d6"} Mar 18 13:22:07 crc kubenswrapper[4912]: I0318 13:22:07.370107 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74336ec1a51c579e5cfbe1281ab8831e59801b504323678fe5d5e62d044ba6d6" Mar 18 13:22:07 crc kubenswrapper[4912]: I0318 13:22:07.369722 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-mtm5z" Mar 18 13:22:07 crc kubenswrapper[4912]: I0318 13:22:07.758812 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-qxbxk"] Mar 18 13:22:07 crc kubenswrapper[4912]: I0318 13:22:07.765178 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-qxbxk"] Mar 18 13:22:08 crc kubenswrapper[4912]: I0318 13:22:08.239835 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203f8736-4c2f-46af-8bfb-191c0d12f031" path="/var/lib/kubelet/pods/203f8736-4c2f-46af-8bfb-191c0d12f031/volumes" Mar 18 13:22:17 crc kubenswrapper[4912]: I0318 13:22:17.908933 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 13:22:34 crc kubenswrapper[4912]: I0318 13:22:34.078092 4912 scope.go:117] "RemoveContainer" 
containerID="9e65f7f2871c61a10f1b4fab46e56bd94dc25d496a74b4f891c21f68521ebf6c" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.003535 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69"] Mar 18 13:22:47 crc kubenswrapper[4912]: E0318 13:22:47.004880 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b2deea-af2a-420b-a2c1-6b109851ce15" containerName="oc" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.004909 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b2deea-af2a-420b-a2c1-6b109851ce15" containerName="oc" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.005230 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b2deea-af2a-420b-a2c1-6b109851ce15" containerName="oc" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.006210 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.008828 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c5s5w" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.015966 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.017879 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.021978 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fgb64" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.030957 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.060546 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.086610 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6plfz\" (UniqueName: \"kubernetes.io/projected/2faefcc2-b6a3-4dee-a077-af88038f3565-kube-api-access-6plfz\") pod \"cinder-operator-controller-manager-6d77645966-pcz7q\" (UID: \"2faefcc2-b6a3-4dee-a077-af88038f3565\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.086788 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fp2b\" (UniqueName: \"kubernetes.io/projected/13092522-58a7-4c49-9164-41523060735e-kube-api-access-7fp2b\") pod \"barbican-operator-controller-manager-5cfd84c587-5wn69\" (UID: \"13092522-58a7-4c49-9164-41523060735e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.087201 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.088795 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.091899 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-s7klb" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.106924 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.108257 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.117862 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hw7jf" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.163679 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.191473 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6plfz\" (UniqueName: \"kubernetes.io/projected/2faefcc2-b6a3-4dee-a077-af88038f3565-kube-api-access-6plfz\") pod \"cinder-operator-controller-manager-6d77645966-pcz7q\" (UID: \"2faefcc2-b6a3-4dee-a077-af88038f3565\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.191558 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fp2b\" (UniqueName: \"kubernetes.io/projected/13092522-58a7-4c49-9164-41523060735e-kube-api-access-7fp2b\") pod \"barbican-operator-controller-manager-5cfd84c587-5wn69\" (UID: \"13092522-58a7-4c49-9164-41523060735e\") " 
pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.191658 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwhxm\" (UniqueName: \"kubernetes.io/projected/1f17f2a1-55b9-493b-9a8a-3d53f21becb9-kube-api-access-cwhxm\") pod \"glance-operator-controller-manager-7d559dcdbd-65r9q\" (UID: \"1f17f2a1-55b9-493b-9a8a-3d53f21becb9\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.191714 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwwjg\" (UniqueName: \"kubernetes.io/projected/e7b90186-2a06-42a0-aec9-8d8f27dfe4dd-kube-api-access-qwwjg\") pod \"designate-operator-controller-manager-6cc65c69fc-glhjp\" (UID: \"e7b90186-2a06-42a0-aec9-8d8f27dfe4dd\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.197093 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.222983 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fp2b\" (UniqueName: \"kubernetes.io/projected/13092522-58a7-4c49-9164-41523060735e-kube-api-access-7fp2b\") pod \"barbican-operator-controller-manager-5cfd84c587-5wn69\" (UID: \"13092522-58a7-4c49-9164-41523060735e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.230143 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.232093 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.235487 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-5s6l2" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.236024 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6plfz\" (UniqueName: \"kubernetes.io/projected/2faefcc2-b6a3-4dee-a077-af88038f3565-kube-api-access-6plfz\") pod \"cinder-operator-controller-manager-6d77645966-pcz7q\" (UID: \"2faefcc2-b6a3-4dee-a077-af88038f3565\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.265344 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.266625 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.270756 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.272480 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-64dgt" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.281021 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.294301 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwhxm\" (UniqueName: \"kubernetes.io/projected/1f17f2a1-55b9-493b-9a8a-3d53f21becb9-kube-api-access-cwhxm\") pod \"glance-operator-controller-manager-7d559dcdbd-65r9q\" (UID: \"1f17f2a1-55b9-493b-9a8a-3d53f21becb9\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.294385 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwwjg\" (UniqueName: \"kubernetes.io/projected/e7b90186-2a06-42a0-aec9-8d8f27dfe4dd-kube-api-access-qwwjg\") pod \"designate-operator-controller-manager-6cc65c69fc-glhjp\" (UID: \"e7b90186-2a06-42a0-aec9-8d8f27dfe4dd\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.294535 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gjkx\" (UniqueName: \"kubernetes.io/projected/98fed63c-9006-4589-a119-1e25fb115041-kube-api-access-8gjkx\") pod \"horizon-operator-controller-manager-64dc66d669-72xxs\" (UID: 
\"98fed63c-9006-4589-a119-1e25fb115041\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.294575 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc49b\" (UniqueName: \"kubernetes.io/projected/f695b268-a8b7-4b72-a37b-dd342d7d369a-kube-api-access-cc49b\") pod \"heat-operator-controller-manager-66dd9d474d-wljg2\" (UID: \"f695b268-a8b7-4b72-a37b-dd342d7d369a\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.321859 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.323352 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.326491 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.327755 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7kcwb" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.331647 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwhxm\" (UniqueName: \"kubernetes.io/projected/1f17f2a1-55b9-493b-9a8a-3d53f21becb9-kube-api-access-cwhxm\") pod \"glance-operator-controller-manager-7d559dcdbd-65r9q\" (UID: \"1f17f2a1-55b9-493b-9a8a-3d53f21becb9\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.332585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qwwjg\" (UniqueName: \"kubernetes.io/projected/e7b90186-2a06-42a0-aec9-8d8f27dfe4dd-kube-api-access-qwwjg\") pod \"designate-operator-controller-manager-6cc65c69fc-glhjp\" (UID: \"e7b90186-2a06-42a0-aec9-8d8f27dfe4dd\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.353001 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.361277 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.392349 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.411637 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.420130 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.420234 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcnxr\" (UniqueName: \"kubernetes.io/projected/b7ec4270-842e-49cb-8d22-16df7b212443-kube-api-access-dcnxr\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " 
pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.420439 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gjkx\" (UniqueName: \"kubernetes.io/projected/98fed63c-9006-4589-a119-1e25fb115041-kube-api-access-8gjkx\") pod \"horizon-operator-controller-manager-64dc66d669-72xxs\" (UID: \"98fed63c-9006-4589-a119-1e25fb115041\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.420505 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc49b\" (UniqueName: \"kubernetes.io/projected/f695b268-a8b7-4b72-a37b-dd342d7d369a-kube-api-access-cc49b\") pod \"heat-operator-controller-manager-66dd9d474d-wljg2\" (UID: \"f695b268-a8b7-4b72-a37b-dd342d7d369a\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.437121 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.444252 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-p7kgm" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.451657 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.457241 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.467995 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc49b\" (UniqueName: \"kubernetes.io/projected/f695b268-a8b7-4b72-a37b-dd342d7d369a-kube-api-access-cc49b\") pod \"heat-operator-controller-manager-66dd9d474d-wljg2\" (UID: \"f695b268-a8b7-4b72-a37b-dd342d7d369a\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.499852 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gjkx\" (UniqueName: \"kubernetes.io/projected/98fed63c-9006-4589-a119-1e25fb115041-kube-api-access-8gjkx\") pod \"horizon-operator-controller-manager-64dc66d669-72xxs\" (UID: \"98fed63c-9006-4589-a119-1e25fb115041\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.524361 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.530266 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.530787 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcnxr\" (UniqueName: \"kubernetes.io/projected/b7ec4270-842e-49cb-8d22-16df7b212443-kube-api-access-dcnxr\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " 
pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:47 crc kubenswrapper[4912]: E0318 13:22:47.531468 4912 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:47 crc kubenswrapper[4912]: E0318 13:22:47.531611 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert podName:b7ec4270-842e-49cb-8d22-16df7b212443 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:48.031592828 +0000 UTC m=+1216.491020243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert") pod "infra-operator-controller-manager-5595c7d6ff-qsghf" (UID: "b7ec4270-842e-49cb-8d22-16df7b212443") : secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.531509 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.548313 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zgk2v" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.586656 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcnxr\" (UniqueName: \"kubernetes.io/projected/b7ec4270-842e-49cb-8d22-16df7b212443-kube-api-access-dcnxr\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.590004 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.611294 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.630657 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.632355 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qkf\" (UniqueName: \"kubernetes.io/projected/8b12ea77-cfde-4e3d-bdc7-04c350f17c09-kube-api-access-x9qkf\") pod \"keystone-operator-controller-manager-76b87776c9-vqtmj\" (UID: \"8b12ea77-cfde-4e3d-bdc7-04c350f17c09\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.632609 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6r2g\" (UniqueName: \"kubernetes.io/projected/334b170e-0f84-42b2-81a6-8c469d187fa3-kube-api-access-l6r2g\") pod \"ironic-operator-controller-manager-6b77b7676d-6ksxg\" (UID: \"334b170e-0f84-42b2-81a6-8c469d187fa3\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.651276 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.653267 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.658966 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-qrbtf" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.677980 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.680026 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.683669 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-c29qv" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.710611 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.717911 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.728346 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.730056 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.733995 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6r2g\" (UniqueName: \"kubernetes.io/projected/334b170e-0f84-42b2-81a6-8c469d187fa3-kube-api-access-l6r2g\") pod \"ironic-operator-controller-manager-6b77b7676d-6ksxg\" (UID: \"334b170e-0f84-42b2-81a6-8c469d187fa3\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.734136 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qkf\" (UniqueName: \"kubernetes.io/projected/8b12ea77-cfde-4e3d-bdc7-04c350f17c09-kube-api-access-x9qkf\") pod \"keystone-operator-controller-manager-76b87776c9-vqtmj\" (UID: \"8b12ea77-cfde-4e3d-bdc7-04c350f17c09\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.734215 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rxx\" (UniqueName: \"kubernetes.io/projected/f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0-kube-api-access-86rxx\") pod \"manila-operator-controller-manager-fbf7bbb96-zp69w\" (UID: \"f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.747866 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-j7dk2" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.752259 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.753863 4912 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.756323 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.757185 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-2zwtt" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.783755 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qkf\" (UniqueName: \"kubernetes.io/projected/8b12ea77-cfde-4e3d-bdc7-04c350f17c09-kube-api-access-x9qkf\") pod \"keystone-operator-controller-manager-76b87776c9-vqtmj\" (UID: \"8b12ea77-cfde-4e3d-bdc7-04c350f17c09\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.806424 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.808229 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6r2g\" (UniqueName: \"kubernetes.io/projected/334b170e-0f84-42b2-81a6-8c469d187fa3-kube-api-access-l6r2g\") pod \"ironic-operator-controller-manager-6b77b7676d-6ksxg\" (UID: \"334b170e-0f84-42b2-81a6-8c469d187fa3\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.845734 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7wx\" (UniqueName: \"kubernetes.io/projected/6ff20347-b4ef-4d01-966c-5ba69dcf546c-kube-api-access-cr7wx\") pod \"neutron-operator-controller-manager-6744dd545c-wm76g\" (UID: 
\"6ff20347-b4ef-4d01-966c-5ba69dcf546c\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.845843 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72x2n\" (UniqueName: \"kubernetes.io/projected/3821e364-991e-4a58-88e6-cf499d12aa70-kube-api-access-72x2n\") pod \"nova-operator-controller-manager-bc5c78db9-f95vk\" (UID: \"3821e364-991e-4a58-88e6-cf499d12aa70\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.845876 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86rxx\" (UniqueName: \"kubernetes.io/projected/f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0-kube-api-access-86rxx\") pod \"manila-operator-controller-manager-fbf7bbb96-zp69w\" (UID: \"f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.845924 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkcwc\" (UniqueName: \"kubernetes.io/projected/6afa3dcd-776b-4472-9e54-31e102d2fb67-kube-api-access-nkcwc\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-l9d25\" (UID: \"6afa3dcd-776b-4472-9e54-31e102d2fb67\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.848384 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.848736 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.857953 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.859910 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.866495 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4sftj" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.866891 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.876103 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.883405 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dv5jv" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.883641 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.891746 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.900455 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rxx\" (UniqueName: \"kubernetes.io/projected/f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0-kube-api-access-86rxx\") pod \"manila-operator-controller-manager-fbf7bbb96-zp69w\" (UID: \"f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.905182 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.906632 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.907566 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.919398 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xflbk" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.951660 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7wx\" (UniqueName: \"kubernetes.io/projected/6ff20347-b4ef-4d01-966c-5ba69dcf546c-kube-api-access-cr7wx\") pod \"neutron-operator-controller-manager-6744dd545c-wm76g\" (UID: \"6ff20347-b4ef-4d01-966c-5ba69dcf546c\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.951873 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72x2n\" (UniqueName: \"kubernetes.io/projected/3821e364-991e-4a58-88e6-cf499d12aa70-kube-api-access-72x2n\") pod \"nova-operator-controller-manager-bc5c78db9-f95vk\" (UID: \"3821e364-991e-4a58-88e6-cf499d12aa70\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.951942 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkcwc\" (UniqueName: \"kubernetes.io/projected/6afa3dcd-776b-4472-9e54-31e102d2fb67-kube-api-access-nkcwc\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-l9d25\" (UID: \"6afa3dcd-776b-4472-9e54-31e102d2fb67\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.952013 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c67rj\" (UniqueName: \"kubernetes.io/projected/45ef8022-adf2-46bc-a112-a5532880c080-kube-api-access-c67rj\") pod 
\"octavia-operator-controller-manager-56f74467c6-52z7q\" (UID: \"45ef8022-adf2-46bc-a112-a5532880c080\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.952998 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.966384 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.974941 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72x2n\" (UniqueName: \"kubernetes.io/projected/3821e364-991e-4a58-88e6-cf499d12aa70-kube-api-access-72x2n\") pod \"nova-operator-controller-manager-bc5c78db9-f95vk\" (UID: \"3821e364-991e-4a58-88e6-cf499d12aa70\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.974985 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7wx\" (UniqueName: \"kubernetes.io/projected/6ff20347-b4ef-4d01-966c-5ba69dcf546c-kube-api-access-cr7wx\") pod \"neutron-operator-controller-manager-6744dd545c-wm76g\" (UID: \"6ff20347-b4ef-4d01-966c-5ba69dcf546c\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.978583 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkcwc\" (UniqueName: \"kubernetes.io/projected/6afa3dcd-776b-4472-9e54-31e102d2fb67-kube-api-access-nkcwc\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-l9d25\" (UID: \"6afa3dcd-776b-4472-9e54-31e102d2fb67\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 
13:22:47.982362 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9"] Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.983674 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.986076 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k4nk8" Mar 18 13:22:47 crc kubenswrapper[4912]: I0318 13:22:47.999839 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.002744 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.004306 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.006321 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vtbcr" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.017560 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.036010 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.036577 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.050548 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.052284 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.054181 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm8mb\" (UniqueName: \"kubernetes.io/projected/67ab4d42-cf77-45ce-9bf7-f0db056c4151-kube-api-access-gm8mb\") pod \"placement-operator-controller-manager-659fb58c6b-9kt49\" (UID: \"67ab4d42-cf77-45ce-9bf7-f0db056c4151\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.054236 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c67rj\" (UniqueName: \"kubernetes.io/projected/45ef8022-adf2-46bc-a112-a5532880c080-kube-api-access-c67rj\") pod \"octavia-operator-controller-manager-56f74467c6-52z7q\" (UID: \"45ef8022-adf2-46bc-a112-a5532880c080\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.054277 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn6c2\" (UniqueName: \"kubernetes.io/projected/7ffd183f-20a4-4586-ac75-597797ada23c-kube-api-access-fn6c2\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.056664 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.056711 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.056855 4912 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.056907 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert podName:b7ec4270-842e-49cb-8d22-16df7b212443 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:49.056890594 +0000 UTC m=+1217.516318019 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert") pod "infra-operator-controller-manager-5595c7d6ff-qsghf" (UID: "b7ec4270-842e-49cb-8d22-16df7b212443") : secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.057705 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-p9k9h" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.066567 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.068097 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.073125 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-s4svs" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.077664 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c67rj\" (UniqueName: \"kubernetes.io/projected/45ef8022-adf2-46bc-a112-a5532880c080-kube-api-access-c67rj\") pod \"octavia-operator-controller-manager-56f74467c6-52z7q\" (UID: \"45ef8022-adf2-46bc-a112-a5532880c080\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.089311 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.109380 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.124939 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.126432 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.145352 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.147149 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.151514 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-bkp6w" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.158025 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwj8\" (UniqueName: \"kubernetes.io/projected/e5f93e56-4ca9-413c-9954-f94f182b6606-kube-api-access-ggwj8\") pod \"ovn-operator-controller-manager-846c4cdcb7-drnxt\" (UID: \"e5f93e56-4ca9-413c-9954-f94f182b6606\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.158246 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn6c2\" (UniqueName: \"kubernetes.io/projected/7ffd183f-20a4-4586-ac75-597797ada23c-kube-api-access-fn6c2\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.158517 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchlw\" (UniqueName: \"kubernetes.io/projected/35ae7eba-4b8f-43ac-b828-5cbc84fed044-kube-api-access-xchlw\") pod \"telemetry-operator-controller-manager-54d55b7b75-h9lqp\" (UID: \"35ae7eba-4b8f-43ac-b828-5cbc84fed044\") " pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.158770 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.158853 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bbwb\" (UniqueName: \"kubernetes.io/projected/2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2-kube-api-access-5bbwb\") pod \"swift-operator-controller-manager-867f54bc44-cllp9\" (UID: \"2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.158901 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs68b\" (UniqueName: \"kubernetes.io/projected/692fb335-57d8-465c-b7ef-d94c53f84523-kube-api-access-bs68b\") pod \"test-operator-controller-manager-8467ccb4c8-6bzx6\" (UID: \"692fb335-57d8-465c-b7ef-d94c53f84523\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.158960 4912 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.159146 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert podName:7ffd183f-20a4-4586-ac75-597797ada23c nodeName:}" failed. No retries permitted until 2026-03-18 13:22:48.65912344 +0000 UTC m=+1217.118550945 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" (UID: "7ffd183f-20a4-4586-ac75-597797ada23c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.158977 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm8mb\" (UniqueName: \"kubernetes.io/projected/67ab4d42-cf77-45ce-9bf7-f0db056c4151-kube-api-access-gm8mb\") pod \"placement-operator-controller-manager-659fb58c6b-9kt49\" (UID: \"67ab4d42-cf77-45ce-9bf7-f0db056c4151\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.165170 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.197109 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn6c2\" (UniqueName: \"kubernetes.io/projected/7ffd183f-20a4-4586-ac75-597797ada23c-kube-api-access-fn6c2\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.200799 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm8mb\" (UniqueName: \"kubernetes.io/projected/67ab4d42-cf77-45ce-9bf7-f0db056c4151-kube-api-access-gm8mb\") pod \"placement-operator-controller-manager-659fb58c6b-9kt49\" (UID: \"67ab4d42-cf77-45ce-9bf7-f0db056c4151\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.280129 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwz9\" (UniqueName: \"kubernetes.io/projected/19eca4e0-1677-4af5-993a-4cd45173287e-kube-api-access-nzwz9\") pod \"watcher-operator-controller-manager-74d6f7b5c-vgvxb\" (UID: \"19eca4e0-1677-4af5-993a-4cd45173287e\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.280277 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchlw\" (UniqueName: \"kubernetes.io/projected/35ae7eba-4b8f-43ac-b828-5cbc84fed044-kube-api-access-xchlw\") pod \"telemetry-operator-controller-manager-54d55b7b75-h9lqp\" (UID: \"35ae7eba-4b8f-43ac-b828-5cbc84fed044\") " pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.280840 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bbwb\" (UniqueName: \"kubernetes.io/projected/2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2-kube-api-access-5bbwb\") pod \"swift-operator-controller-manager-867f54bc44-cllp9\" (UID: \"2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.280902 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs68b\" (UniqueName: \"kubernetes.io/projected/692fb335-57d8-465c-b7ef-d94c53f84523-kube-api-access-bs68b\") pod \"test-operator-controller-manager-8467ccb4c8-6bzx6\" (UID: \"692fb335-57d8-465c-b7ef-d94c53f84523\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.281393 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwj8\" (UniqueName: 
\"kubernetes.io/projected/e5f93e56-4ca9-413c-9954-f94f182b6606-kube-api-access-ggwj8\") pod \"ovn-operator-controller-manager-846c4cdcb7-drnxt\" (UID: \"e5f93e56-4ca9-413c-9954-f94f182b6606\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.318815 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.320816 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs68b\" (UniqueName: \"kubernetes.io/projected/692fb335-57d8-465c-b7ef-d94c53f84523-kube-api-access-bs68b\") pod \"test-operator-controller-manager-8467ccb4c8-6bzx6\" (UID: \"692fb335-57d8-465c-b7ef-d94c53f84523\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.325852 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.326704 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwj8\" (UniqueName: \"kubernetes.io/projected/e5f93e56-4ca9-413c-9954-f94f182b6606-kube-api-access-ggwj8\") pod \"ovn-operator-controller-manager-846c4cdcb7-drnxt\" (UID: \"e5f93e56-4ca9-413c-9954-f94f182b6606\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.327669 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.329709 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchlw\" (UniqueName: \"kubernetes.io/projected/35ae7eba-4b8f-43ac-b828-5cbc84fed044-kube-api-access-xchlw\") pod \"telemetry-operator-controller-manager-54d55b7b75-h9lqp\" (UID: \"35ae7eba-4b8f-43ac-b828-5cbc84fed044\") " pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.330897 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.331320 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gghvn" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.335456 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbwb\" (UniqueName: \"kubernetes.io/projected/2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2-kube-api-access-5bbwb\") pod \"swift-operator-controller-manager-867f54bc44-cllp9\" (UID: \"2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.339431 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.374063 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.384442 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzwz9\" (UniqueName: 
\"kubernetes.io/projected/19eca4e0-1677-4af5-993a-4cd45173287e-kube-api-access-nzwz9\") pod \"watcher-operator-controller-manager-74d6f7b5c-vgvxb\" (UID: \"19eca4e0-1677-4af5-993a-4cd45173287e\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.401174 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.403136 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.416349 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fvzzr" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.419736 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzwz9\" (UniqueName: \"kubernetes.io/projected/19eca4e0-1677-4af5-993a-4cd45173287e-kube-api-access-nzwz9\") pod \"watcher-operator-controller-manager-74d6f7b5c-vgvxb\" (UID: \"19eca4e0-1677-4af5-993a-4cd45173287e\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.429999 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.446886 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l"] Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.487216 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.487376 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf74d\" (UniqueName: \"kubernetes.io/projected/d96a656e-5436-4af3-b4cd-98c485c402a1-kube-api-access-mf74d\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.487462 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.523649 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.590945 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.591103 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf74d\" (UniqueName: \"kubernetes.io/projected/d96a656e-5436-4af3-b4cd-98c485c402a1-kube-api-access-mf74d\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.591167 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlrpv\" (UniqueName: \"kubernetes.io/projected/e00a6814-84ad-42fc-a5c5-b629750cfa80-kube-api-access-nlrpv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pwv5l\" (UID: \"e00a6814-84ad-42fc-a5c5-b629750cfa80\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.591204 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.591442 
4912 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.591526 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:49.091498581 +0000 UTC m=+1217.550926006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "metrics-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.592015 4912 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.592116 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:49.092092987 +0000 UTC m=+1217.551520412 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.614260 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf74d\" (UniqueName: \"kubernetes.io/projected/d96a656e-5436-4af3-b4cd-98c485c402a1-kube-api-access-mf74d\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.622858 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.651600 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.694087 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.694193 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlrpv\" (UniqueName: \"kubernetes.io/projected/e00a6814-84ad-42fc-a5c5-b629750cfa80-kube-api-access-nlrpv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pwv5l\" (UID: \"e00a6814-84ad-42fc-a5c5-b629750cfa80\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.694976 4912 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: E0318 13:22:48.695148 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert podName:7ffd183f-20a4-4586-ac75-597797ada23c nodeName:}" failed. No retries permitted until 2026-03-18 13:22:49.695020843 +0000 UTC m=+1218.154448268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" (UID: "7ffd183f-20a4-4586-ac75-597797ada23c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.734281 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.734842 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlrpv\" (UniqueName: \"kubernetes.io/projected/e00a6814-84ad-42fc-a5c5-b629750cfa80-kube-api-access-nlrpv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pwv5l\" (UID: \"e00a6814-84ad-42fc-a5c5-b629750cfa80\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.781237 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 13:22:48 crc kubenswrapper[4912]: I0318 13:22:48.925942 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.113659 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.114334 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.114580 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.115060 4912 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.115167 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:50.115141142 +0000 UTC m=+1218.574568567 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "metrics-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.115903 4912 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.115936 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:50.115926643 +0000 UTC m=+1218.575354078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "webhook-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.116007 4912 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.116066 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert podName:b7ec4270-842e-49cb-8d22-16df7b212443 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:51.116056437 +0000 UTC m=+1219.575483862 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert") pod "infra-operator-controller-manager-5595c7d6ff-qsghf" (UID: "b7ec4270-842e-49cb-8d22-16df7b212443") : secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.483332 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2"] Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.538802 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q"] Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.703697 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69"] Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.721357 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp"] Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.728638 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.728824 4912 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: E0318 13:22:49.728873 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert 
podName:7ffd183f-20a4-4586-ac75-597797ada23c nodeName:}" failed. No retries permitted until 2026-03-18 13:22:51.728856658 +0000 UTC m=+1220.188284083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" (UID: "7ffd183f-20a4-4586-ac75-597797ada23c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.856464 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" event={"ID":"2faefcc2-b6a3-4dee-a077-af88038f3565","Type":"ContainerStarted","Data":"fa71478f31b556d37a49c8718a59b582d3eb9444ba1f285f311e2491f7675e54"} Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.858920 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" event={"ID":"e7b90186-2a06-42a0-aec9-8d8f27dfe4dd","Type":"ContainerStarted","Data":"57688e0d94d00002350e6d1bd6e63fc6608d64ff4e2d47e0704aae895da0ff0a"} Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.860551 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" event={"ID":"f695b268-a8b7-4b72-a37b-dd342d7d369a","Type":"ContainerStarted","Data":"197fb2e417444e76c6912bc4fa8ae3b81bd3a24132cfffe02c1fc5f4dc514f5a"} Mar 18 13:22:49 crc kubenswrapper[4912]: I0318 13:22:49.861532 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" event={"ID":"13092522-58a7-4c49-9164-41523060735e","Type":"ContainerStarted","Data":"a967f61648a1de08e7ce0ab89336351203883c5ebcde54a3ce21ad84611a8031"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.147944 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.148161 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.148406 4912 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.148484 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:52.148462563 +0000 UTC m=+1220.607889988 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "webhook-server-cert" not found Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.148923 4912 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.148954 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:52.148944286 +0000 UTC m=+1220.608371711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "metrics-server-cert" not found Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.448764 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.475550 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.516771 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.537428 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 
13:22:50.555316 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.567870 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q"] Mar 18 13:22:50 crc kubenswrapper[4912]: W0318 13:22:50.574876 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afa3dcd_776b_4472_9e54_31e102d2fb67.slice/crio-307b067897b467ed763bd7084aa8d3457cbf39eb13d93fba4b6ccc0a6bc3fbfb WatchSource:0}: Error finding container 307b067897b467ed763bd7084aa8d3457cbf39eb13d93fba4b6ccc0a6bc3fbfb: Status 404 returned error can't find the container with id 307b067897b467ed763bd7084aa8d3457cbf39eb13d93fba4b6ccc0a6bc3fbfb Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.584237 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.593720 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.605704 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.822914 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.854584 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.873568 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.893392 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" event={"ID":"334b170e-0f84-42b2-81a6-8c469d187fa3","Type":"ContainerStarted","Data":"3b666693619c77478c2ae8d419d5da76160e3c7d0b50b193d8bd4ae8c822e39e"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.896963 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.914984 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.939486 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.939544 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" event={"ID":"6ff20347-b4ef-4d01-966c-5ba69dcf546c","Type":"ContainerStarted","Data":"68b580cae5f4e2369d04c117e4e112f0feecca1a0bd432678ee40195ea8ffcf1"} Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.939716 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5bbwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-867f54bc44-cllp9_openstack-operators(2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.939848 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ggwj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-846c4cdcb7-drnxt_openstack-operators(e5f93e56-4ca9-413c-9954-f94f182b6606): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.941733 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" podUID="2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2" Mar 18 13:22:50 crc 
kubenswrapper[4912]: E0318 13:22:50.941819 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" podUID="e5f93e56-4ca9-413c-9954-f94f182b6606" Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.943373 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gm8mb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-659fb58c6b-9kt49_openstack-operators(67ab4d42-cf77-45ce-9bf7-f0db056c4151): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.943680 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nzwz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-74d6f7b5c-vgvxb_openstack-operators(19eca4e0-1677-4af5-993a-4cd45173287e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 13:22:50 crc kubenswrapper[4912]: E0318 13:22:50.944596 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" Mar 18 13:22:50 crc 
kubenswrapper[4912]: E0318 13:22:50.945139 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.953004 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" event={"ID":"35ae7eba-4b8f-43ac-b828-5cbc84fed044","Type":"ContainerStarted","Data":"66dd359c81dc6ff3e5ec8038e5ff1fd6172fa74228806d4a47ab5a5cb626f507"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.953182 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9"] Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.962104 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" event={"ID":"3821e364-991e-4a58-88e6-cf499d12aa70","Type":"ContainerStarted","Data":"015398f3520953f28c8dc121e76f96d02084c49e71a3f897d26a9b368cb19717"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.965427 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" event={"ID":"98fed63c-9006-4589-a119-1e25fb115041","Type":"ContainerStarted","Data":"2a508e4801f35f1d2c312a2aa93db9122a5a701b2d3c317457d2a6f65538e9c7"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.967308 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" event={"ID":"692fb335-57d8-465c-b7ef-d94c53f84523","Type":"ContainerStarted","Data":"c62e58cdd02cb2c1a343ba008fc6925247d94c11b3bd47beb0f8860fc506487c"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.971842 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" event={"ID":"1f17f2a1-55b9-493b-9a8a-3d53f21becb9","Type":"ContainerStarted","Data":"3ed726eaf3f9f9f672440a87ffe93e7df3067aa6ddf98fbcbaae5ea7fa5c79e4"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.974089 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" event={"ID":"8b12ea77-cfde-4e3d-bdc7-04c350f17c09","Type":"ContainerStarted","Data":"c8ab22f98a719b8d465ff88ec6f91b60fb9ca541573318bc7ce9716f32575d43"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.976794 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" event={"ID":"6afa3dcd-776b-4472-9e54-31e102d2fb67","Type":"ContainerStarted","Data":"307b067897b467ed763bd7084aa8d3457cbf39eb13d93fba4b6ccc0a6bc3fbfb"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.978609 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" event={"ID":"45ef8022-adf2-46bc-a112-a5532880c080","Type":"ContainerStarted","Data":"d7f430d47f8a7998d1e9dd2dd28fd9eb2b989e8ad02409d3c6f6f9af9012bb64"} Mar 18 13:22:50 crc kubenswrapper[4912]: I0318 13:22:50.981117 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" event={"ID":"f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0","Type":"ContainerStarted","Data":"f852df9e550ad8984f189237bc6940b5d4abbbfcfcdbca20f1658667ee3e395c"} Mar 18 13:22:51 crc kubenswrapper[4912]: I0318 13:22:51.135857 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " 
pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:51 crc kubenswrapper[4912]: E0318 13:22:51.136052 4912 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:51 crc kubenswrapper[4912]: E0318 13:22:51.136121 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert podName:b7ec4270-842e-49cb-8d22-16df7b212443 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:55.136101775 +0000 UTC m=+1223.595529200 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert") pod "infra-operator-controller-manager-5595c7d6ff-qsghf" (UID: "b7ec4270-842e-49cb-8d22-16df7b212443") : secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:51 crc kubenswrapper[4912]: I0318 13:22:51.748167 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:51 crc kubenswrapper[4912]: E0318 13:22:51.748689 4912 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:51 crc kubenswrapper[4912]: E0318 13:22:51.752236 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert podName:7ffd183f-20a4-4586-ac75-597797ada23c nodeName:}" failed. No retries permitted until 2026-03-18 13:22:55.748777853 +0000 UTC m=+1224.208205278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" (UID: "7ffd183f-20a4-4586-ac75-597797ada23c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:52 crc kubenswrapper[4912]: I0318 13:22:52.012150 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" event={"ID":"19eca4e0-1677-4af5-993a-4cd45173287e","Type":"ContainerStarted","Data":"9f3ad0da8116b78d740d6de3dafbb2ab2073edbf9768b8828d1ad831c637f7f6"} Mar 18 13:22:52 crc kubenswrapper[4912]: I0318 13:22:52.024450 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" event={"ID":"e00a6814-84ad-42fc-a5c5-b629750cfa80","Type":"ContainerStarted","Data":"0dfbc263572c36a215117b81ed503eb5e709721a4d31f44a755749803acc65f7"} Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.025745 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" Mar 18 13:22:52 crc kubenswrapper[4912]: I0318 13:22:52.029063 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" event={"ID":"67ab4d42-cf77-45ce-9bf7-f0db056c4151","Type":"ContainerStarted","Data":"2eaadaa1dab4aad2320bfd737de721c1ea34793cdea18bb5b8c71067f7dbb048"} Mar 18 13:22:52 crc kubenswrapper[4912]: I0318 13:22:52.041922 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" event={"ID":"e5f93e56-4ca9-413c-9954-f94f182b6606","Type":"ContainerStarted","Data":"44943194ab891ff07fd815c656420b4499c9c1d490726c6c90e8b58363ab1105"} Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.042733 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.043527 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" podUID="e5f93e56-4ca9-413c-9954-f94f182b6606" Mar 18 13:22:52 crc kubenswrapper[4912]: I0318 13:22:52.044655 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" event={"ID":"2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2","Type":"ContainerStarted","Data":"4bc3bd4705e9833612f11f4c6781408468c66f0a3074445f1e449a868ba368b9"} Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.046492 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" podUID="2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2" Mar 18 
13:22:52 crc kubenswrapper[4912]: I0318 13:22:52.163175 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.163391 4912 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 13:22:52 crc kubenswrapper[4912]: I0318 13:22:52.163446 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.163486 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:56.163461544 +0000 UTC m=+1224.622888969 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "metrics-server-cert" not found Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.163708 4912 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 13:22:52 crc kubenswrapper[4912]: E0318 13:22:52.163921 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:22:56.163896795 +0000 UTC m=+1224.623324420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "webhook-server-cert" not found Mar 18 13:22:53 crc kubenswrapper[4912]: E0318 13:22:53.065325 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" Mar 18 13:22:53 crc kubenswrapper[4912]: E0318 13:22:53.066606 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" podUID="2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2" Mar 18 13:22:53 crc kubenswrapper[4912]: E0318 13:22:53.066696 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" podUID="e5f93e56-4ca9-413c-9954-f94f182b6606" Mar 18 13:22:53 crc kubenswrapper[4912]: E0318 13:22:53.069437 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" Mar 18 13:22:55 crc kubenswrapper[4912]: I0318 13:22:55.236138 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:22:55 crc kubenswrapper[4912]: E0318 13:22:55.236420 4912 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:55 crc kubenswrapper[4912]: E0318 13:22:55.236534 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert podName:b7ec4270-842e-49cb-8d22-16df7b212443 nodeName:}" failed. 
No retries permitted until 2026-03-18 13:23:03.236507977 +0000 UTC m=+1231.695935412 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert") pod "infra-operator-controller-manager-5595c7d6ff-qsghf" (UID: "b7ec4270-842e-49cb-8d22-16df7b212443") : secret "infra-operator-webhook-server-cert" not found Mar 18 13:22:55 crc kubenswrapper[4912]: I0318 13:22:55.847824 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:22:55 crc kubenswrapper[4912]: E0318 13:22:55.848062 4912 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:55 crc kubenswrapper[4912]: E0318 13:22:55.848128 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert podName:7ffd183f-20a4-4586-ac75-597797ada23c nodeName:}" failed. No retries permitted until 2026-03-18 13:23:03.848106076 +0000 UTC m=+1232.307533501 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" (UID: "7ffd183f-20a4-4586-ac75-597797ada23c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 13:22:56 crc kubenswrapper[4912]: I0318 13:22:56.258335 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:56 crc kubenswrapper[4912]: E0318 13:22:56.258695 4912 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 13:22:56 crc kubenswrapper[4912]: I0318 13:22:56.259031 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:22:56 crc kubenswrapper[4912]: E0318 13:22:56.259144 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:23:04.259112727 +0000 UTC m=+1232.718540152 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "metrics-server-cert" not found Mar 18 13:22:56 crc kubenswrapper[4912]: E0318 13:22:56.259252 4912 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 13:22:56 crc kubenswrapper[4912]: E0318 13:22:56.259329 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs podName:d96a656e-5436-4af3-b4cd-98c485c402a1 nodeName:}" failed. No retries permitted until 2026-03-18 13:23:04.259307113 +0000 UTC m=+1232.718734548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs") pod "openstack-operator-controller-manager-585bd669c7-vrxh8" (UID: "d96a656e-5436-4af3-b4cd-98c485c402a1") : secret "webhook-server-cert" not found Mar 18 13:23:01 crc kubenswrapper[4912]: E0318 13:23:01.284211 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396" Mar 18 13:23:01 crc kubenswrapper[4912]: E0318 13:23:01.285753 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6plfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6d77645966-pcz7q_openstack-operators(2faefcc2-b6a3-4dee-a077-af88038f3565): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:01 crc kubenswrapper[4912]: E0318 13:23:01.286989 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" podUID="2faefcc2-b6a3-4dee-a077-af88038f3565" Mar 18 13:23:02 crc kubenswrapper[4912]: E0318 13:23:02.004402 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:e55a2b7d1ddf3e48be378bf8d3e844969e957a478171badd8c7d69f9e89bfb2e" Mar 18 13:23:02 crc kubenswrapper[4912]: E0318 13:23:02.004694 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:e55a2b7d1ddf3e48be378bf8d3e844969e957a478171badd8c7d69f9e89bfb2e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qwwjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6cc65c69fc-glhjp_openstack-operators(e7b90186-2a06-42a0-aec9-8d8f27dfe4dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:02 crc kubenswrapper[4912]: E0318 13:23:02.009645 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podUID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" Mar 18 13:23:02 crc kubenswrapper[4912]: E0318 13:23:02.142839 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:e55a2b7d1ddf3e48be378bf8d3e844969e957a478171badd8c7d69f9e89bfb2e\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podUID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" Mar 18 13:23:02 crc kubenswrapper[4912]: E0318 13:23:02.145722 4912 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" podUID="2faefcc2-b6a3-4dee-a077-af88038f3565" Mar 18 13:23:03 crc kubenswrapper[4912]: I0318 13:23:03.257337 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:23:03 crc kubenswrapper[4912]: I0318 13:23:03.297329 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ec4270-842e-49cb-8d22-16df7b212443-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-qsghf\" (UID: \"b7ec4270-842e-49cb-8d22-16df7b212443\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:23:03 crc kubenswrapper[4912]: I0318 13:23:03.420062 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:23:03 crc kubenswrapper[4912]: I0318 13:23:03.870891 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:23:03 crc kubenswrapper[4912]: I0318 13:23:03.875503 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffd183f-20a4-4586-ac75-597797ada23c-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s\" (UID: \"7ffd183f-20a4-4586-ac75-597797ada23c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:23:03 crc kubenswrapper[4912]: I0318 13:23:03.931237 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:23:04 crc kubenswrapper[4912]: E0318 13:23:04.263443 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:362b121892b1f4d39a8e828d1c1a408f854a0228977f1e8115d7298d45c5c241" Mar 18 13:23:04 crc kubenswrapper[4912]: E0318 13:23:04.264385 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:362b121892b1f4d39a8e828d1c1a408f854a0228977f1e8115d7298d45c5c241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7fp2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-5cfd84c587-5wn69_openstack-operators(13092522-58a7-4c49-9164-41523060735e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:04 crc kubenswrapper[4912]: E0318 13:23:04.267306 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podUID="13092522-58a7-4c49-9164-41523060735e" Mar 18 13:23:04 crc kubenswrapper[4912]: I0318 13:23:04.279348 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod 
\"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:23:04 crc kubenswrapper[4912]: I0318 13:23:04.279576 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:23:04 crc kubenswrapper[4912]: I0318 13:23:04.283767 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-metrics-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:23:04 crc kubenswrapper[4912]: I0318 13:23:04.283774 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d96a656e-5436-4af3-b4cd-98c485c402a1-webhook-certs\") pod \"openstack-operator-controller-manager-585bd669c7-vrxh8\" (UID: \"d96a656e-5436-4af3-b4cd-98c485c402a1\") " pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:23:04 crc kubenswrapper[4912]: I0318 13:23:04.423077 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:23:05 crc kubenswrapper[4912]: E0318 13:23:05.004054 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552" Mar 18 13:23:05 crc kubenswrapper[4912]: E0318 13:23:05.004384 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-72x2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-bc5c78db9-f95vk_openstack-operators(3821e364-991e-4a58-88e6-cf499d12aa70): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:05 crc kubenswrapper[4912]: E0318 13:23:05.005647 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" Mar 18 13:23:05 crc kubenswrapper[4912]: E0318 13:23:05.226137 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:362b121892b1f4d39a8e828d1c1a408f854a0228977f1e8115d7298d45c5c241\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podUID="13092522-58a7-4c49-9164-41523060735e" Mar 18 13:23:05 crc kubenswrapper[4912]: E0318 13:23:05.226677 4912 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" Mar 18 13:23:06 crc kubenswrapper[4912]: E0318 13:23:06.067839 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4" Mar 18 13:23:06 crc kubenswrapper[4912]: E0318 13:23:06.068593 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c67rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-56f74467c6-52z7q_openstack-operators(45ef8022-adf2-46bc-a112-a5532880c080): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:06 crc kubenswrapper[4912]: E0318 13:23:06.069869 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" podUID="45ef8022-adf2-46bc-a112-a5532880c080" Mar 18 13:23:06 crc kubenswrapper[4912]: E0318 13:23:06.233452 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" podUID="45ef8022-adf2-46bc-a112-a5532880c080" Mar 18 13:23:09 crc kubenswrapper[4912]: E0318 13:23:09.606174 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8" Mar 18 13:23:09 crc kubenswrapper[4912]: E0318 13:23:09.607088 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nkcwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6f5b7bcd4-l9d25_openstack-operators(6afa3dcd-776b-4472-9e54-31e102d2fb67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:09 crc kubenswrapper[4912]: E0318 13:23:09.608476 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" Mar 18 13:23:10 crc kubenswrapper[4912]: E0318 13:23:10.273699 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" Mar 18 13:23:10 crc kubenswrapper[4912]: E0318 13:23:10.557521 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:d90f208c9239ed14cf538cf9784a6ad021fa0c86d0a4b6ae4ccd5ec851daf27a" Mar 18 13:23:10 crc kubenswrapper[4912]: E0318 13:23:10.557769 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:d90f208c9239ed14cf538cf9784a6ad021fa0c86d0a4b6ae4ccd5ec851daf27a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cwhxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-7d559dcdbd-65r9q_openstack-operators(1f17f2a1-55b9-493b-9a8a-3d53f21becb9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:10 crc kubenswrapper[4912]: E0318 13:23:10.559156 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.166016 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/ironic-operator@sha256:d35772c19f96660d35a618d95b92d934e75b8473ca52eea5e62c144a69d68ac1" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.166286 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:d35772c19f96660d35a618d95b92d934e75b8473ca52eea5e62c144a69d68ac1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l6r2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6b77b7676d-6ksxg_openstack-operators(334b170e-0f84-42b2-81a6-8c469d187fa3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.167489 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" podUID="334b170e-0f84-42b2-81a6-8c469d187fa3" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.287844 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:d90f208c9239ed14cf538cf9784a6ad021fa0c86d0a4b6ae4ccd5ec851daf27a\\\"\"" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.288013 4912 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:d35772c19f96660d35a618d95b92d934e75b8473ca52eea5e62c144a69d68ac1\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" podUID="334b170e-0f84-42b2-81a6-8c469d187fa3" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.793600 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.793843 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x9qkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-76b87776c9-vqtmj_openstack-operators(8b12ea77-cfde-4e3d-bdc7-04c350f17c09): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:11 crc kubenswrapper[4912]: E0318 13:23:11.795130 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.291463 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.454787 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.455419 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nlrpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pwv5l_openstack-operators(e00a6814-84ad-42fc-a5c5-b629750cfa80): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.457176 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" podUID="e00a6814-84ad-42fc-a5c5-b629750cfa80" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.523445 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.523528 4912 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.523736 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xchlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-54d55b7b75-h9lqp_openstack-operators(35ae7eba-4b8f-43ac-b828-5cbc84fed044): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:23:12 crc kubenswrapper[4912]: E0318 13:23:12.525437 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" Mar 18 13:23:13 crc kubenswrapper[4912]: E0318 13:23:13.299774 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" podUID="e00a6814-84ad-42fc-a5c5-b629750cfa80" Mar 18 13:23:13 crc kubenswrapper[4912]: E0318 13:23:13.300540 4912 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" Mar 18 13:23:15 crc kubenswrapper[4912]: I0318 13:23:15.934128 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8"] Mar 18 13:23:15 crc kubenswrapper[4912]: I0318 13:23:15.943780 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf"] Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.059133 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s"] Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.328069 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" event={"ID":"f695b268-a8b7-4b72-a37b-dd342d7d369a","Type":"ContainerStarted","Data":"f09b36bfaee9e8ed70baeddc827771198905bf7caeb7cc1756ffcf377854370e"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.328175 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.333620 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" event={"ID":"e5f93e56-4ca9-413c-9954-f94f182b6606","Type":"ContainerStarted","Data":"ad09aabc54bd1c47bf6353a48b357e0a74f3340770f17640e3637c60b2a6f544"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.334146 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.336144 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" event={"ID":"2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2","Type":"ContainerStarted","Data":"535832fbeb91730792f09b9412f055e3c92132cf15fa78beb82c3c8f35455b2b"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.336386 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.342636 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" event={"ID":"19eca4e0-1677-4af5-993a-4cd45173287e","Type":"ContainerStarted","Data":"c063a3ba28dd231bb5c3c20876f23edc8f5d0cb51e83e3f7fe26d86e395ecd41"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.344279 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.360522 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" event={"ID":"6ff20347-b4ef-4d01-966c-5ba69dcf546c","Type":"ContainerStarted","Data":"9881ef1d6f722210a00ffd35d8692d65cbd085c6292c551ddd164ef4a9e9efde"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.360611 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.365062 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" 
event={"ID":"d96a656e-5436-4af3-b4cd-98c485c402a1","Type":"ContainerStarted","Data":"71d5a545bccdb1a856c6785cdaa745fe27f7e80f58ca2a33fb9fdc81a37f48c6"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.366708 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" event={"ID":"7ffd183f-20a4-4586-ac75-597797ada23c","Type":"ContainerStarted","Data":"ca2bc3067262512f6bce26f969cad3b80636b57372866588ed8244367130e83b"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.367996 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" event={"ID":"2faefcc2-b6a3-4dee-a077-af88038f3565","Type":"ContainerStarted","Data":"8829fefdc365d80d71ceeb78a39c2717f8e8a83b1808640b31d91a6c19c069f7"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.368965 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.373731 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" event={"ID":"b7ec4270-842e-49cb-8d22-16df7b212443","Type":"ContainerStarted","Data":"98c2e92f5b5f1fe6a2c37dba866f98c63973479ed6ba7ac2a003bd6e2369b4eb"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.381954 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" event={"ID":"f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0","Type":"ContainerStarted","Data":"b70cb9746b9e2dce40680ff2d90f739d7ee651330834b5714502ea08dbc0f8aa"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.383245 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 13:23:16 crc 
kubenswrapper[4912]: I0318 13:23:16.393106 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" event={"ID":"98fed63c-9006-4589-a119-1e25fb115041","Type":"ContainerStarted","Data":"a1418d2c6a7c6ce8c89ee270bd44091f9a4eb4291e4d378d52ca09e5e39fa140"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.393860 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.406845 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" event={"ID":"67ab4d42-cf77-45ce-9bf7-f0db056c4151","Type":"ContainerStarted","Data":"25aa0d27b8703286979ccc95a6d60caa8a185ec8897641b6a34d16071bb0623f"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.407636 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.409700 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" event={"ID":"692fb335-57d8-465c-b7ef-d94c53f84523","Type":"ContainerStarted","Data":"a2838d2a3358ca2a78c27e005b22cc589be8c26b55d94ee076c37a6fd1abc08d"} Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.409808 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.409903 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" podStartSLOduration=5.462137926 podStartE2EDuration="29.409878084s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" 
firstStartedPulling="2026-03-18 13:22:49.511234388 +0000 UTC m=+1217.970661813" lastFinishedPulling="2026-03-18 13:23:13.458974556 +0000 UTC m=+1241.918401971" observedRunningTime="2026-03-18 13:23:16.386368952 +0000 UTC m=+1244.845796397" watchObservedRunningTime="2026-03-18 13:23:16.409878084 +0000 UTC m=+1244.869305509" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.416948 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" podStartSLOduration=6.511223454 podStartE2EDuration="29.416936654s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.550282156 +0000 UTC m=+1219.009709581" lastFinishedPulling="2026-03-18 13:23:13.455995356 +0000 UTC m=+1241.915422781" observedRunningTime="2026-03-18 13:23:16.415398852 +0000 UTC m=+1244.874826297" watchObservedRunningTime="2026-03-18 13:23:16.416936654 +0000 UTC m=+1244.876364069" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.458252 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" podStartSLOduration=4.485080291 podStartE2EDuration="30.458230413s" podCreationTimestamp="2026-03-18 13:22:46 +0000 UTC" firstStartedPulling="2026-03-18 13:22:49.51092191 +0000 UTC m=+1217.970349335" lastFinishedPulling="2026-03-18 13:23:15.484072032 +0000 UTC m=+1243.943499457" observedRunningTime="2026-03-18 13:23:16.453679101 +0000 UTC m=+1244.913106536" watchObservedRunningTime="2026-03-18 13:23:16.458230413 +0000 UTC m=+1244.917657838" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.676672 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" podStartSLOduration=6.659460286 podStartE2EDuration="29.676595332s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" 
firstStartedPulling="2026-03-18 13:22:50.440797672 +0000 UTC m=+1218.900225097" lastFinishedPulling="2026-03-18 13:23:13.457932718 +0000 UTC m=+1241.917360143" observedRunningTime="2026-03-18 13:23:16.624661796 +0000 UTC m=+1245.084089251" watchObservedRunningTime="2026-03-18 13:23:16.676595332 +0000 UTC m=+1245.136022757" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.729370 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podStartSLOduration=5.120153358 podStartE2EDuration="29.72934787s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.943604187 +0000 UTC m=+1219.403031612" lastFinishedPulling="2026-03-18 13:23:15.552798699 +0000 UTC m=+1244.012226124" observedRunningTime="2026-03-18 13:23:16.726481253 +0000 UTC m=+1245.185908698" watchObservedRunningTime="2026-03-18 13:23:16.72934787 +0000 UTC m=+1245.188775305" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.871265 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" podStartSLOduration=5.322804923 podStartE2EDuration="29.871242263s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.939521836 +0000 UTC m=+1219.398949261" lastFinishedPulling="2026-03-18 13:23:15.487959176 +0000 UTC m=+1243.947386601" observedRunningTime="2026-03-18 13:23:16.870273327 +0000 UTC m=+1245.329700752" watchObservedRunningTime="2026-03-18 13:23:16.871242263 +0000 UTC m=+1245.330669688" Mar 18 13:23:16 crc kubenswrapper[4912]: I0318 13:23:16.920526 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" podStartSLOduration=5.408843325 podStartE2EDuration="29.920496697s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" 
firstStartedPulling="2026-03-18 13:22:50.939478865 +0000 UTC m=+1219.398906290" lastFinishedPulling="2026-03-18 13:23:15.451132237 +0000 UTC m=+1243.910559662" observedRunningTime="2026-03-18 13:23:16.908092954 +0000 UTC m=+1245.367520389" watchObservedRunningTime="2026-03-18 13:23:16.920496697 +0000 UTC m=+1245.379924122" Mar 18 13:23:17 crc kubenswrapper[4912]: I0318 13:23:17.009629 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podStartSLOduration=5.498525808 podStartE2EDuration="30.009593562s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.943219747 +0000 UTC m=+1219.402647172" lastFinishedPulling="2026-03-18 13:23:15.454287501 +0000 UTC m=+1243.913714926" observedRunningTime="2026-03-18 13:23:16.995134473 +0000 UTC m=+1245.454561898" watchObservedRunningTime="2026-03-18 13:23:17.009593562 +0000 UTC m=+1245.469020987" Mar 18 13:23:17 crc kubenswrapper[4912]: I0318 13:23:17.086778 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" podStartSLOduration=7.469093749 podStartE2EDuration="30.086755496s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.841449073 +0000 UTC m=+1219.300876488" lastFinishedPulling="2026-03-18 13:23:13.45911081 +0000 UTC m=+1241.918538235" observedRunningTime="2026-03-18 13:23:17.075477563 +0000 UTC m=+1245.534904998" watchObservedRunningTime="2026-03-18 13:23:17.086755496 +0000 UTC m=+1245.546182921" Mar 18 13:23:17 crc kubenswrapper[4912]: I0318 13:23:17.090670 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" podStartSLOduration=7.1546823 podStartE2EDuration="30.090662661s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" 
firstStartedPulling="2026-03-18 13:22:50.523259582 +0000 UTC m=+1218.982687007" lastFinishedPulling="2026-03-18 13:23:13.459239943 +0000 UTC m=+1241.918667368" observedRunningTime="2026-03-18 13:23:17.048117647 +0000 UTC m=+1245.507545102" watchObservedRunningTime="2026-03-18 13:23:17.090662661 +0000 UTC m=+1245.550090086" Mar 18 13:23:17 crc kubenswrapper[4912]: I0318 13:23:17.422522 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" event={"ID":"d96a656e-5436-4af3-b4cd-98c485c402a1","Type":"ContainerStarted","Data":"be8d29d920d4c8847fdfdaf3077eebbc04f370ce4e05837acfa9d1fa5a476663"} Mar 18 13:23:17 crc kubenswrapper[4912]: I0318 13:23:17.424760 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:23:17 crc kubenswrapper[4912]: I0318 13:23:17.474808 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" podStartSLOduration=30.474777273 podStartE2EDuration="30.474777273s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:23:17.461444095 +0000 UTC m=+1245.920871550" watchObservedRunningTime="2026-03-18 13:23:17.474777273 +0000 UTC m=+1245.934204728" Mar 18 13:23:18 crc kubenswrapper[4912]: I0318 13:23:18.447336 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" event={"ID":"e7b90186-2a06-42a0-aec9-8d8f27dfe4dd","Type":"ContainerStarted","Data":"e5e4d7773dc1bbce8c3eb432d06c7b2d1b8e34356cf450d0f3924d04a6958ae8"} Mar 18 13:23:18 crc kubenswrapper[4912]: I0318 13:23:18.449568 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 13:23:18 crc kubenswrapper[4912]: I0318 13:23:18.475215 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podStartSLOduration=3.457526431 podStartE2EDuration="31.47517431s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:49.761435703 +0000 UTC m=+1218.220863128" lastFinishedPulling="2026-03-18 13:23:17.779083582 +0000 UTC m=+1246.238511007" observedRunningTime="2026-03-18 13:23:18.469555749 +0000 UTC m=+1246.928983174" watchObservedRunningTime="2026-03-18 13:23:18.47517431 +0000 UTC m=+1246.934601735" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.509345 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" event={"ID":"b7ec4270-842e-49cb-8d22-16df7b212443","Type":"ContainerStarted","Data":"3daec3fbcadd23641ff25d1437f3c81a9deef1c0d6ac0b0534c7d9021600671b"} Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.512546 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" event={"ID":"3821e364-991e-4a58-88e6-cf499d12aa70","Type":"ContainerStarted","Data":"d89a66518ba4c47e2bf682f15b4d61dc0c7f7e9e25a8fd2adf64629b8dd829af"} Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.513006 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.514989 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" event={"ID":"13092522-58a7-4c49-9164-41523060735e","Type":"ContainerStarted","Data":"37b49390da442ae7479fee7189cc6177ae463fef97b4a273c2ef012771885ab2"} Mar 18 
13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.516135 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.517492 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" event={"ID":"45ef8022-adf2-46bc-a112-a5532880c080","Type":"ContainerStarted","Data":"a08dafbd9cbe4398bee406eeb92482a1177905b4efa74b2db5352cd6b973731c"} Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.517924 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.520407 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" event={"ID":"7ffd183f-20a4-4586-ac75-597797ada23c","Type":"ContainerStarted","Data":"f2010f0f0b777b55c4bd3f94257ee1b2cb3f040b26a6fc492adf3db776615ea2"} Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.520596 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.534750 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" podStartSLOduration=29.93173735 podStartE2EDuration="36.53471858s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:23:15.963640611 +0000 UTC m=+1244.423068036" lastFinishedPulling="2026-03-18 13:23:22.566621841 +0000 UTC m=+1251.026049266" observedRunningTime="2026-03-18 13:23:23.526727945 +0000 UTC m=+1251.986155370" watchObservedRunningTime="2026-03-18 13:23:23.53471858 +0000 UTC 
m=+1251.994146005" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.555463 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" podStartSLOduration=4.537359234 podStartE2EDuration="36.555442567s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.548725103 +0000 UTC m=+1219.008152528" lastFinishedPulling="2026-03-18 13:23:22.566808436 +0000 UTC m=+1251.026235861" observedRunningTime="2026-03-18 13:23:23.549886987 +0000 UTC m=+1252.009314412" watchObservedRunningTime="2026-03-18 13:23:23.555442567 +0000 UTC m=+1252.014869992" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.603494 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" podStartSLOduration=30.115500487 podStartE2EDuration="36.603473027s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:23:16.080164712 +0000 UTC m=+1244.539592137" lastFinishedPulling="2026-03-18 13:23:22.568137252 +0000 UTC m=+1251.027564677" observedRunningTime="2026-03-18 13:23:23.600652192 +0000 UTC m=+1252.060079637" watchObservedRunningTime="2026-03-18 13:23:23.603473027 +0000 UTC m=+1252.062900452" Mar 18 13:23:23 crc kubenswrapper[4912]: I0318 13:23:23.606460 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podStartSLOduration=4.780094655 podStartE2EDuration="37.606444937s" podCreationTimestamp="2026-03-18 13:22:46 +0000 UTC" firstStartedPulling="2026-03-18 13:22:49.742429967 +0000 UTC m=+1218.201857392" lastFinishedPulling="2026-03-18 13:23:22.568780249 +0000 UTC m=+1251.028207674" observedRunningTime="2026-03-18 13:23:23.572408223 +0000 UTC m=+1252.031835648" watchObservedRunningTime="2026-03-18 13:23:23.606444937 +0000 UTC 
m=+1252.065872362" Mar 18 13:23:24 crc kubenswrapper[4912]: I0318 13:23:24.271571 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podStartSLOduration=5.232356186 podStartE2EDuration="37.271537232s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.527430375 +0000 UTC m=+1218.986857800" lastFinishedPulling="2026-03-18 13:23:22.566611421 +0000 UTC m=+1251.026038846" observedRunningTime="2026-03-18 13:23:23.641229512 +0000 UTC m=+1252.100656937" watchObservedRunningTime="2026-03-18 13:23:24.271537232 +0000 UTC m=+1252.730964657" Mar 18 13:23:24 crc kubenswrapper[4912]: I0318 13:23:24.433360 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 13:23:24 crc kubenswrapper[4912]: I0318 13:23:24.529776 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.538508 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" event={"ID":"6afa3dcd-776b-4472-9e54-31e102d2fb67","Type":"ContainerStarted","Data":"984933cc67078f32c68a626439599558baa5f5b7414e6b677c6236ee51856a37"} Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.539110 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.539742 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" 
event={"ID":"1f17f2a1-55b9-493b-9a8a-3d53f21becb9","Type":"ContainerStarted","Data":"94e423b83257ecda2eecb455aa5efd7278789ae3f0c6971115e9ea22f3cf700c"} Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.539885 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.540822 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" event={"ID":"35ae7eba-4b8f-43ac-b828-5cbc84fed044","Type":"ContainerStarted","Data":"0b129c4536b6ab7c339f8d393dc4cdf863fe3b2376be95f20e3580d5c5088ccc"} Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.541185 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.572380 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podStartSLOduration=4.83193777 podStartE2EDuration="38.572357682s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.579562741 +0000 UTC m=+1219.038990166" lastFinishedPulling="2026-03-18 13:23:24.319982653 +0000 UTC m=+1252.779410078" observedRunningTime="2026-03-18 13:23:25.56817252 +0000 UTC m=+1254.027599965" watchObservedRunningTime="2026-03-18 13:23:25.572357682 +0000 UTC m=+1254.031785117" Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.594664 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podStartSLOduration=4.462232363 podStartE2EDuration="38.594638421s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.577174916 +0000 UTC m=+1219.036602331" 
lastFinishedPulling="2026-03-18 13:23:24.709580964 +0000 UTC m=+1253.169008389" observedRunningTime="2026-03-18 13:23:25.587939211 +0000 UTC m=+1254.047366646" watchObservedRunningTime="2026-03-18 13:23:25.594638421 +0000 UTC m=+1254.054065846" Mar 18 13:23:25 crc kubenswrapper[4912]: I0318 13:23:25.614671 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podStartSLOduration=5.1618448279999996 podStartE2EDuration="38.614644279s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.869467944 +0000 UTC m=+1219.328895369" lastFinishedPulling="2026-03-18 13:23:24.322267395 +0000 UTC m=+1252.781694820" observedRunningTime="2026-03-18 13:23:25.60834433 +0000 UTC m=+1254.067771765" watchObservedRunningTime="2026-03-18 13:23:25.614644279 +0000 UTC m=+1254.074071704" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.358568 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.397348 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.455609 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.561172 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" event={"ID":"8b12ea77-cfde-4e3d-bdc7-04c350f17c09","Type":"ContainerStarted","Data":"8044c85abe572a622b2d8c72d5b31829f8a5f0c5d6d9a9c74b0b54f1c7b906f8"} Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.561536 4912 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.564531 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" event={"ID":"334b170e-0f84-42b2-81a6-8c469d187fa3","Type":"ContainerStarted","Data":"a4b88a2e6f74a393beebd9f0978cd7ab5f4796b4198f0930f82123a5c688f5ce"} Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.564790 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.622163 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" podStartSLOduration=4.503392574 podStartE2EDuration="40.622120893s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.523084097 +0000 UTC m=+1218.982511512" lastFinishedPulling="2026-03-18 13:23:26.641812396 +0000 UTC m=+1255.101239831" observedRunningTime="2026-03-18 13:23:27.619387839 +0000 UTC m=+1256.078815264" watchObservedRunningTime="2026-03-18 13:23:27.622120893 +0000 UTC m=+1256.081548318" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.624020 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.629138 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podStartSLOduration=4.319861911 podStartE2EDuration="40.62911231s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.522889032 +0000 UTC m=+1218.982316457" 
lastFinishedPulling="2026-03-18 13:23:26.832139391 +0000 UTC m=+1255.291566856" observedRunningTime="2026-03-18 13:23:27.600178683 +0000 UTC m=+1256.059606118" watchObservedRunningTime="2026-03-18 13:23:27.62911231 +0000 UTC m=+1256.088539735" Mar 18 13:23:27 crc kubenswrapper[4912]: I0318 13:23:27.760893 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.003503 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.098143 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.141877 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.328731 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.434488 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.534571 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.625673 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 13:23:28 crc 
kubenswrapper[4912]: I0318 13:23:28.746233 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 13:23:28 crc kubenswrapper[4912]: I0318 13:23:28.785719 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 13:23:29 crc kubenswrapper[4912]: I0318 13:23:29.585271 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" event={"ID":"e00a6814-84ad-42fc-a5c5-b629750cfa80","Type":"ContainerStarted","Data":"e59af2387a6da9b9b4ca771384ccaa58e1066f87702e8ae14bad08355f8b66e4"} Mar 18 13:23:29 crc kubenswrapper[4912]: I0318 13:23:29.606742 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pwv5l" podStartSLOduration=4.86019954 podStartE2EDuration="42.6067225s" podCreationTimestamp="2026-03-18 13:22:47 +0000 UTC" firstStartedPulling="2026-03-18 13:22:50.939154086 +0000 UTC m=+1219.398581511" lastFinishedPulling="2026-03-18 13:23:28.685677036 +0000 UTC m=+1257.145104471" observedRunningTime="2026-03-18 13:23:29.603094153 +0000 UTC m=+1258.062521578" watchObservedRunningTime="2026-03-18 13:23:29.6067225 +0000 UTC m=+1258.066149925" Mar 18 13:23:33 crc kubenswrapper[4912]: I0318 13:23:33.426993 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" Mar 18 13:23:33 crc kubenswrapper[4912]: I0318 13:23:33.937583 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 13:23:37 crc kubenswrapper[4912]: I0318 13:23:37.462025 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 13:23:37 crc kubenswrapper[4912]: I0318 13:23:37.857126 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 13:23:37 crc kubenswrapper[4912]: I0318 13:23:37.914540 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 13:23:38 crc kubenswrapper[4912]: I0318 13:23:38.040615 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 13:23:38 crc kubenswrapper[4912]: I0318 13:23:38.657845 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.050787 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6c5r8"] Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.054004 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.060774 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4kch2" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.061921 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.062096 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.062219 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.073721 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6c5r8"] Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.121814 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-config\") pod \"dnsmasq-dns-675f4bcbfc-6c5r8\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.122024 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74r74\" (UniqueName: \"kubernetes.io/projected/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-kube-api-access-74r74\") pod \"dnsmasq-dns-675f4bcbfc-6c5r8\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.128896 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wgstq"] Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.131886 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.137381 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.148720 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wgstq"] Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.180052 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564004-9dzjl"] Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.181644 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.184924 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.185030 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.185318 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.191506 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-9dzjl"] Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.224988 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lczq\" (UniqueName: \"kubernetes.io/projected/9970cb0b-ff3a-4850-a612-03c861a5bbf4-kube-api-access-9lczq\") pod \"auto-csr-approver-29564004-9dzjl\" (UID: \"9970cb0b-ff3a-4850-a612-03c861a5bbf4\") " pod="openshift-infra/auto-csr-approver-29564004-9dzjl" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.225072 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.225502 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-config\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.225672 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6mw\" (UniqueName: \"kubernetes.io/projected/8999bed8-e35f-4458-9ca7-03c5352b8f4a-kube-api-access-dd6mw\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.225737 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-config\") pod \"dnsmasq-dns-675f4bcbfc-6c5r8\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.225826 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74r74\" (UniqueName: \"kubernetes.io/projected/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-kube-api-access-74r74\") pod \"dnsmasq-dns-675f4bcbfc-6c5r8\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.227412 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-config\") pod \"dnsmasq-dns-675f4bcbfc-6c5r8\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.276156 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74r74\" (UniqueName: \"kubernetes.io/projected/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-kube-api-access-74r74\") pod \"dnsmasq-dns-675f4bcbfc-6c5r8\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.328162 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-config\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.328290 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6mw\" (UniqueName: \"kubernetes.io/projected/8999bed8-e35f-4458-9ca7-03c5352b8f4a-kube-api-access-dd6mw\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.328464 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lczq\" (UniqueName: \"kubernetes.io/projected/9970cb0b-ff3a-4850-a612-03c861a5bbf4-kube-api-access-9lczq\") pod \"auto-csr-approver-29564004-9dzjl\" (UID: \"9970cb0b-ff3a-4850-a612-03c861a5bbf4\") " pod="openshift-infra/auto-csr-approver-29564004-9dzjl" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.328516 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.329269 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-config\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.329716 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.349997 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lczq\" (UniqueName: \"kubernetes.io/projected/9970cb0b-ff3a-4850-a612-03c861a5bbf4-kube-api-access-9lczq\") pod \"auto-csr-approver-29564004-9dzjl\" (UID: \"9970cb0b-ff3a-4850-a612-03c861a5bbf4\") " pod="openshift-infra/auto-csr-approver-29564004-9dzjl" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.350028 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6mw\" (UniqueName: \"kubernetes.io/projected/8999bed8-e35f-4458-9ca7-03c5352b8f4a-kube-api-access-dd6mw\") pod \"dnsmasq-dns-78dd6ddcc-wgstq\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.378298 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.450375 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:00 crc kubenswrapper[4912]: I0318 13:24:00.511156 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" Mar 18 13:24:01 crc kubenswrapper[4912]: I0318 13:24:00.891076 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6c5r8"] Mar 18 13:24:01 crc kubenswrapper[4912]: I0318 13:24:00.899263 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" event={"ID":"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e","Type":"ContainerStarted","Data":"84cde48f8c6dce5453cfc377e0aeb2f31293d35e548d8df8415f9ee22c807443"} Mar 18 13:24:01 crc kubenswrapper[4912]: I0318 13:24:01.554051 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wgstq"] Mar 18 13:24:01 crc kubenswrapper[4912]: W0318 13:24:01.557973 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8999bed8_e35f_4458_9ca7_03c5352b8f4a.slice/crio-8f49739a28d6f61da2a866c54dc9516f4d9099daba7e195a96a96d02c834dd24 WatchSource:0}: Error finding container 8f49739a28d6f61da2a866c54dc9516f4d9099daba7e195a96a96d02c834dd24: Status 404 returned error can't find the container with id 8f49739a28d6f61da2a866c54dc9516f4d9099daba7e195a96a96d02c834dd24 Mar 18 13:24:01 crc kubenswrapper[4912]: W0318 13:24:01.558635 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9970cb0b_ff3a_4850_a612_03c861a5bbf4.slice/crio-7c37bac891aa6fb83743c249d26990769d4be7df59436568c42460b990355068 WatchSource:0}: Error finding container 
7c37bac891aa6fb83743c249d26990769d4be7df59436568c42460b990355068: Status 404 returned error can't find the container with id 7c37bac891aa6fb83743c249d26990769d4be7df59436568c42460b990355068 Mar 18 13:24:01 crc kubenswrapper[4912]: I0318 13:24:01.563573 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-9dzjl"] Mar 18 13:24:01 crc kubenswrapper[4912]: I0318 13:24:01.918165 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" event={"ID":"8999bed8-e35f-4458-9ca7-03c5352b8f4a","Type":"ContainerStarted","Data":"8f49739a28d6f61da2a866c54dc9516f4d9099daba7e195a96a96d02c834dd24"} Mar 18 13:24:01 crc kubenswrapper[4912]: I0318 13:24:01.920673 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" event={"ID":"9970cb0b-ff3a-4850-a612-03c861a5bbf4","Type":"ContainerStarted","Data":"7c37bac891aa6fb83743c249d26990769d4be7df59436568c42460b990355068"} Mar 18 13:24:02 crc kubenswrapper[4912]: I0318 13:24:02.961471 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6c5r8"] Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:02.995158 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-jm6sw"] Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:02.997431 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.029742 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-config\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.030213 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxfvp\" (UniqueName: \"kubernetes.io/projected/416535c8-1b0c-4a0a-a789-0223c011a4bb-kube-api-access-lxfvp\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.030346 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.031670 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-jm6sw"] Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.139728 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-config\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.139816 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfvp\" (UniqueName: 
\"kubernetes.io/projected/416535c8-1b0c-4a0a-a789-0223c011a4bb-kube-api-access-lxfvp\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.139925 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.141001 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.154508 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-config\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.262444 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxfvp\" (UniqueName: \"kubernetes.io/projected/416535c8-1b0c-4a0a-a789-0223c011a4bb-kube-api-access-lxfvp\") pod \"dnsmasq-dns-5ccc8479f9-jm6sw\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.391239 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.395014 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wgstq"] Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.450294 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-r7pms"] Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.455242 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.479809 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.479888 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-config\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.479932 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvhk2\" (UniqueName: \"kubernetes.io/projected/6144ea19-b46e-449b-85a2-89be0d315561-kube-api-access-wvhk2\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.505859 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-r7pms"] Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 
13:24:03.637994 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.673937 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-config\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.643471 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.674640 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvhk2\" (UniqueName: \"kubernetes.io/projected/6144ea19-b46e-449b-85a2-89be0d315561-kube-api-access-wvhk2\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.676099 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-config\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.712942 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvhk2\" 
(UniqueName: \"kubernetes.io/projected/6144ea19-b46e-449b-85a2-89be0d315561-kube-api-access-wvhk2\") pod \"dnsmasq-dns-57d769cc4f-r7pms\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.945396 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:03 crc kubenswrapper[4912]: I0318 13:24:03.984891 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" event={"ID":"9970cb0b-ff3a-4850-a612-03c861a5bbf4","Type":"ContainerStarted","Data":"eb995ecb618883b37121108c37fef475a9529000104dd82d0d3c2796ef4890ae"} Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.027270 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" podStartSLOduration=2.687579353 podStartE2EDuration="4.027245338s" podCreationTimestamp="2026-03-18 13:24:00 +0000 UTC" firstStartedPulling="2026-03-18 13:24:01.561692184 +0000 UTC m=+1290.021119609" lastFinishedPulling="2026-03-18 13:24:02.901358169 +0000 UTC m=+1291.360785594" observedRunningTime="2026-03-18 13:24:04.008027231 +0000 UTC m=+1292.467454666" watchObservedRunningTime="2026-03-18 13:24:04.027245338 +0000 UTC m=+1292.486672753" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.130004 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.134473 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.153482 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-plsnj" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.154142 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.154983 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.155112 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.154986 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.155586 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.163673 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.180656 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.296835 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5w99\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-kube-api-access-s5w99\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297198 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297317 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297432 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297504 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297568 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297643 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297731 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297804 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297868 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.297937 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.316446 4912 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-jm6sw"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.414735 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5w99\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-kube-api-access-s5w99\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.414815 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.414857 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.414942 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.414982 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc 
kubenswrapper[4912]: I0318 13:24:04.415004 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.415058 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.415116 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.415138 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.415175 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.415196 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.415619 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.419260 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.419315 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.419893 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.420252 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.424591 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.424695 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.424948 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.442289 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.448972 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5w99\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-kube-api-access-s5w99\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.460934 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.461013 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e1887cd5a1a70a198c286e0ab70ed1f31939a231341948609d08078c9331ef7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.562012 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.592010 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.594974 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.600009 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.600236 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.600280 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.600455 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.600528 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lrnbc" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.600596 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.612201 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.618710 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.632645 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.634883 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.666012 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.668561 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.693249 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.723847 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.723893 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-config-data\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.723922 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-server-conf\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.723952 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.723970 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.723993 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724046 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724071 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724089 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724109 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76x4z\" (UniqueName: 
\"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-kube-api-access-76x4z\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724130 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rgpx\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-kube-api-access-2rgpx\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724159 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724177 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724196 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e25d0c6c-24af-4cb6-b961-ae312ec23df9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724235 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724260 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-config-data\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724292 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724317 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e25d0c6c-24af-4cb6-b961-ae312ec23df9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724346 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97b795fb-bb07-4401-8e18-0b826303b4ba-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724407 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724815 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97b795fb-bb07-4401-8e18-0b826303b4ba-pod-info\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.724866 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: W0318 13:24:04.727099 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6144ea19_b46e_449b_85a2_89be0d315561.slice/crio-6b7ba42edaae166542710e1744d4033bb27b5c0fbe4801d1cb2ff50b9e72e24b WatchSource:0}: Error finding container 6b7ba42edaae166542710e1744d4033bb27b5c0fbe4801d1cb2ff50b9e72e24b: Status 404 returned error can't find the container with id 6b7ba42edaae166542710e1744d4033bb27b5c0fbe4801d1cb2ff50b9e72e24b Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.746016 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.760776 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-r7pms"] Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.790281 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.826783 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.826848 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54097ed-90b5-4369-8304-8bcb3a7d1839-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.826876 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-config-data\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.826919 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-server-conf\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.826955 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: 
I0318 13:24:04.826980 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827009 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827061 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827092 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827115 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827134 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827153 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76x4z\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-kube-api-access-76x4z\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827177 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rgpx\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-kube-api-access-2rgpx\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827211 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827233 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827250 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e25d0c6c-24af-4cb6-b961-ae312ec23df9-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827285 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827307 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-config-data\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827326 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827352 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827371 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-config-data\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 
18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827391 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e25d0c6c-24af-4cb6-b961-ae312ec23df9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827418 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97b795fb-bb07-4401-8e18-0b826303b4ba-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827437 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827466 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827492 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827509 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54097ed-90b5-4369-8304-8bcb3a7d1839-pod-info\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827532 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6tb\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-kube-api-access-rr6tb\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827558 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97b795fb-bb07-4401-8e18-0b826303b4ba-pod-info\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827590 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827609 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-server-conf\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827639 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827668 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.827901 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-config-data\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.828266 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.828553 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-config-data\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.829295 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.829414 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-server-conf\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.829827 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.830723 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.831284 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.831314 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8eb5ba5613cad0584a090ab94d3d93f59f943bdbf827f567e324c9dfa87263aa/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.831896 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.837108 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97b795fb-bb07-4401-8e18-0b826303b4ba-pod-info\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.837910 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e25d0c6c-24af-4cb6-b961-ae312ec23df9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.840018 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" 
Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.840669 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.841250 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.841581 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97b795fb-bb07-4401-8e18-0b826303b4ba-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.845470 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.846696 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e25d0c6c-24af-4cb6-b961-ae312ec23df9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.847210 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.847994 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.848087 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aa201d4246c012acd5c41babb6a867c820a838e5b585b432c358397deecb8fad/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.849956 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.851787 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rgpx\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-kube-api-access-2rgpx\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.856732 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76x4z\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-kube-api-access-76x4z\") pod \"rabbitmq-server-1\" (UID: 
\"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.883094 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.892816 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " pod="openstack/rabbitmq-server-1" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.931027 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935579 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935654 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-config-data\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 
13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935710 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935755 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935776 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54097ed-90b5-4369-8304-8bcb3a7d1839-pod-info\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935799 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6tb\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-kube-api-access-rr6tb\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935858 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-server-conf\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935890 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935936 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.935982 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54097ed-90b5-4369-8304-8bcb3a7d1839-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.933423 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.943968 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.945460 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-config-data\") pod \"rabbitmq-server-2\" 
(UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.945460 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.946232 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-server-conf\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.946956 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54097ed-90b5-4369-8304-8bcb3a7d1839-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.947029 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.947512 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.951352 4912 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.951396 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5aa3e0c5b60af900e5d16575c4249b7a6e9c51067aa146eb784e89b8270a4c65/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.953102 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54097ed-90b5-4369-8304-8bcb3a7d1839-pod-info\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.965509 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6tb\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-kube-api-access-rr6tb\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.991098 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " pod="openstack/rabbitmq-server-2" Mar 18 13:24:04 crc kubenswrapper[4912]: I0318 13:24:04.993732 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.017553 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.032853 4912 generic.go:334] "Generic (PLEG): container finished" podID="9970cb0b-ff3a-4850-a612-03c861a5bbf4" containerID="eb995ecb618883b37121108c37fef475a9529000104dd82d0d3c2796ef4890ae" exitCode=0 Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.033128 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" event={"ID":"9970cb0b-ff3a-4850-a612-03c861a5bbf4","Type":"ContainerDied","Data":"eb995ecb618883b37121108c37fef475a9529000104dd82d0d3c2796ef4890ae"} Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.033646 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.046977 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" event={"ID":"6144ea19-b46e-449b-85a2-89be0d315561","Type":"ContainerStarted","Data":"6b7ba42edaae166542710e1744d4033bb27b5c0fbe4801d1cb2ff50b9e72e24b"} Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.050030 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" event={"ID":"416535c8-1b0c-4a0a-a789-0223c011a4bb","Type":"ContainerStarted","Data":"052d30a92973759b276844c1b3d69e2fd9a2291b61facf441f66a5cb118418a7"} Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.358552 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.405294 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.410745 4912 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.416627 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qgvsp" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.418160 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.443351 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.445048 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.445296 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.448269 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.610343 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0973556-9c2c-4037-b800-d11ecf1904cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.613453 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.613501 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.613745 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0973556-9c2c-4037-b800-d11ecf1904cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.613804 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0973556-9c2c-4037-b800-d11ecf1904cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.613842 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.613904 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2n4v\" (UniqueName: \"kubernetes.io/projected/d0973556-9c2c-4037-b800-d11ecf1904cc-kube-api-access-x2n4v\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.613990 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.725956 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0973556-9c2c-4037-b800-d11ecf1904cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.726026 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.726095 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2n4v\" (UniqueName: \"kubernetes.io/projected/d0973556-9c2c-4037-b800-d11ecf1904cc-kube-api-access-x2n4v\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.726184 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.726608 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0973556-9c2c-4037-b800-d11ecf1904cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.729783 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0973556-9c2c-4037-b800-d11ecf1904cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.729849 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.730678 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.730734 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6dfe48b91f4a73bdfe7be84c7cb44519a59f50ea8df625b99a6856462bb5f251/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.731951 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.729817 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.732139 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.732558 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0973556-9c2c-4037-b800-d11ecf1904cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.733002 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0973556-9c2c-4037-b800-d11ecf1904cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.737251 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0973556-9c2c-4037-b800-d11ecf1904cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.757218 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0973556-9c2c-4037-b800-d11ecf1904cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.761933 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2n4v\" (UniqueName: \"kubernetes.io/projected/d0973556-9c2c-4037-b800-d11ecf1904cc-kube-api-access-x2n4v\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:05 crc kubenswrapper[4912]: I0318 13:24:05.800645 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bb35f3-3431-4f7a-b7c3-8335d20f8e41\") pod \"openstack-galera-0\" (UID: \"d0973556-9c2c-4037-b800-d11ecf1904cc\") " pod="openstack/openstack-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 
13:24:06.031399 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:24:06 crc kubenswrapper[4912]: W0318 13:24:06.045765 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b795fb_bb07_4401_8e18_0b826303b4ba.slice/crio-0e85baf20994fcef0c477cb3dd9bdfc3baa9ab328d7579b9e6e14f3562cb4d7d WatchSource:0}: Error finding container 0e85baf20994fcef0c477cb3dd9bdfc3baa9ab328d7579b9e6e14f3562cb4d7d: Status 404 returned error can't find the container with id 0e85baf20994fcef0c477cb3dd9bdfc3baa9ab328d7579b9e6e14f3562cb4d7d Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.069557 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97b795fb-bb07-4401-8e18-0b826303b4ba","Type":"ContainerStarted","Data":"0e85baf20994fcef0c477cb3dd9bdfc3baa9ab328d7579b9e6e14f3562cb4d7d"} Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.071870 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1","Type":"ContainerStarted","Data":"729845b90d8dc95720368f0bd984e68203789c173fb352c3f3906c683bad49ed"} Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.076586 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.123889 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.146690 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:24:06 crc kubenswrapper[4912]: W0318 13:24:06.203998 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode25d0c6c_24af_4cb6_b961_ae312ec23df9.slice/crio-b9ad57e4bc13d286c13ff050cb1e331aef87770cd4bcf9771cfd32e16458286e WatchSource:0}: Error finding container b9ad57e4bc13d286c13ff050cb1e331aef87770cd4bcf9771cfd32e16458286e: Status 404 returned error can't find the container with id b9ad57e4bc13d286c13ff050cb1e331aef87770cd4bcf9771cfd32e16458286e Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.696659 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.730927 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 13:24:06 crc kubenswrapper[4912]: E0318 13:24:06.731692 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9970cb0b-ff3a-4850-a612-03c861a5bbf4" containerName="oc" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.731717 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9970cb0b-ff3a-4850-a612-03c861a5bbf4" containerName="oc" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.731893 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9970cb0b-ff3a-4850-a612-03c861a5bbf4" containerName="oc" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.733377 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.737450 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.737700 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ww9w8" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.737784 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.737811 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.771653 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.868392 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lczq\" (UniqueName: \"kubernetes.io/projected/9970cb0b-ff3a-4850-a612-03c861a5bbf4-kube-api-access-9lczq\") pod \"9970cb0b-ff3a-4850-a612-03c861a5bbf4\" (UID: \"9970cb0b-ff3a-4850-a612-03c861a5bbf4\") " Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.868770 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.868822 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/77736799-2ebe-4076-9717-6741aed93599-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.868867 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csdjl\" (UniqueName: \"kubernetes.io/projected/77736799-2ebe-4076-9717-6741aed93599-kube-api-access-csdjl\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.868930 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.868994 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/77736799-2ebe-4076-9717-6741aed93599-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.869050 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.869074 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.869163 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77736799-2ebe-4076-9717-6741aed93599-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.899012 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9970cb0b-ff3a-4850-a612-03c861a5bbf4-kube-api-access-9lczq" (OuterVolumeSpecName: "kube-api-access-9lczq") pod "9970cb0b-ff3a-4850-a612-03c861a5bbf4" (UID: "9970cb0b-ff3a-4850-a612-03c861a5bbf4"). InnerVolumeSpecName "kube-api-access-9lczq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.963453 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.972284 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdjl\" (UniqueName: \"kubernetes.io/projected/77736799-2ebe-4076-9717-6741aed93599-kube-api-access-csdjl\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.972492 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.972620 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/77736799-2ebe-4076-9717-6741aed93599-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.972682 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.972717 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.972906 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77736799-2ebe-4076-9717-6741aed93599-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.972964 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.973105 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/77736799-2ebe-4076-9717-6741aed93599-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.973216 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lczq\" (UniqueName: \"kubernetes.io/projected/9970cb0b-ff3a-4850-a612-03c861a5bbf4-kube-api-access-9lczq\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.973767 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/77736799-2ebe-4076-9717-6741aed93599-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " 
pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.975455 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.976176 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.977542 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77736799-2ebe-4076-9717-6741aed93599-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.986014 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.986105 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/251a29eee129d2665eb275750b495e6ed6a5605610b7687c57ad93fbc922bf10/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:06 crc kubenswrapper[4912]: I0318 13:24:06.991448 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/77736799-2ebe-4076-9717-6741aed93599-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.000855 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.000923 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.002392 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77736799-2ebe-4076-9717-6741aed93599-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.018658 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csdjl\" (UniqueName: \"kubernetes.io/projected/77736799-2ebe-4076-9717-6741aed93599-kube-api-access-csdjl\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:07 crc kubenswrapper[4912]: W0318 13:24:07.052727 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0973556_9c2c_4037_b800_d11ecf1904cc.slice/crio-ad8fb72f7f9290214d82144c65353571af0b737dcb23c752e285169b6e5a5108 WatchSource:0}: Error finding container ad8fb72f7f9290214d82144c65353571af0b737dcb23c752e285169b6e5a5108: Status 404 returned error can't find the container with id ad8fb72f7f9290214d82144c65353571af0b737dcb23c752e285169b6e5a5108 Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.087686 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fa676e9-f3bf-438d-be68-50e249f1af0f\") pod \"openstack-cell1-galera-0\" (UID: \"77736799-2ebe-4076-9717-6741aed93599\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.109024 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" event={"ID":"9970cb0b-ff3a-4850-a612-03c861a5bbf4","Type":"ContainerDied","Data":"7c37bac891aa6fb83743c249d26990769d4be7df59436568c42460b990355068"} Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.109093 4912 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7c37bac891aa6fb83743c249d26990769d4be7df59436568c42460b990355068" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.109167 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-9dzjl" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.113454 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f54097ed-90b5-4369-8304-8bcb3a7d1839","Type":"ContainerStarted","Data":"c459d755036419f0da01ec84540face151649cffb8107c677ddd6d314f2c9481"} Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.117983 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.120294 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.121274 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e25d0c6c-24af-4cb6-b961-ae312ec23df9","Type":"ContainerStarted","Data":"b9ad57e4bc13d286c13ff050cb1e331aef87770cd4bcf9771cfd32e16458286e"} Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.127434 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.127932 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.128175 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-wkm2p" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.145191 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.170730 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29563998-cjg4w"] Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.189677 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-cjg4w"] Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.279852 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f35e63e-80c3-4dca-b383-9650e3aa63a2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.280465 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2whl\" (UniqueName: \"kubernetes.io/projected/5f35e63e-80c3-4dca-b383-9650e3aa63a2-kube-api-access-g2whl\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.280524 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f35e63e-80c3-4dca-b383-9650e3aa63a2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.280565 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f35e63e-80c3-4dca-b383-9650e3aa63a2-config-data\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.280659 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/5f35e63e-80c3-4dca-b383-9650e3aa63a2-kolla-config\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.368367 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.382193 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f35e63e-80c3-4dca-b383-9650e3aa63a2-config-data\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.382322 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5f35e63e-80c3-4dca-b383-9650e3aa63a2-kolla-config\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.382474 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f35e63e-80c3-4dca-b383-9650e3aa63a2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.382530 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2whl\" (UniqueName: \"kubernetes.io/projected/5f35e63e-80c3-4dca-b383-9650e3aa63a2-kube-api-access-g2whl\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.382565 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f35e63e-80c3-4dca-b383-9650e3aa63a2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.384192 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5f35e63e-80c3-4dca-b383-9650e3aa63a2-kolla-config\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.384887 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f35e63e-80c3-4dca-b383-9650e3aa63a2-config-data\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.390769 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f35e63e-80c3-4dca-b383-9650e3aa63a2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.390822 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f35e63e-80c3-4dca-b383-9650e3aa63a2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.407341 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2whl\" (UniqueName: \"kubernetes.io/projected/5f35e63e-80c3-4dca-b383-9650e3aa63a2-kube-api-access-g2whl\") pod \"memcached-0\" (UID: \"5f35e63e-80c3-4dca-b383-9650e3aa63a2\") " pod="openstack/memcached-0" Mar 18 13:24:07 crc kubenswrapper[4912]: I0318 13:24:07.495346 4912 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 13:24:08 crc kubenswrapper[4912]: I0318 13:24:08.228950 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0973556-9c2c-4037-b800-d11ecf1904cc","Type":"ContainerStarted","Data":"ad8fb72f7f9290214d82144c65353571af0b737dcb23c752e285169b6e5a5108"} Mar 18 13:24:08 crc kubenswrapper[4912]: I0318 13:24:08.267905 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9395a26a-9b3b-4280-adc8-5d8d4f193d5f" path="/var/lib/kubelet/pods/9395a26a-9b3b-4280-adc8-5d8d4f193d5f/volumes" Mar 18 13:24:09 crc kubenswrapper[4912]: I0318 13:24:09.687469 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:24:09 crc kubenswrapper[4912]: I0318 13:24:09.698112 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:24:09 crc kubenswrapper[4912]: I0318 13:24:09.705605 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hk5cn" Mar 18 13:24:09 crc kubenswrapper[4912]: I0318 13:24:09.710635 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:24:09 crc kubenswrapper[4912]: I0318 13:24:09.875990 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b544\" (UniqueName: \"kubernetes.io/projected/db6818da-f93a-45ed-8eb3-2fcd8ddaefd5-kube-api-access-8b544\") pod \"kube-state-metrics-0\" (UID: \"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5\") " pod="openstack/kube-state-metrics-0" Mar 18 13:24:09 crc kubenswrapper[4912]: I0318 13:24:09.983603 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b544\" (UniqueName: \"kubernetes.io/projected/db6818da-f93a-45ed-8eb3-2fcd8ddaefd5-kube-api-access-8b544\") 
pod \"kube-state-metrics-0\" (UID: \"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5\") " pod="openstack/kube-state-metrics-0" Mar 18 13:24:10 crc kubenswrapper[4912]: I0318 13:24:10.047788 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b544\" (UniqueName: \"kubernetes.io/projected/db6818da-f93a-45ed-8eb3-2fcd8ddaefd5-kube-api-access-8b544\") pod \"kube-state-metrics-0\" (UID: \"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5\") " pod="openstack/kube-state-metrics-0" Mar 18 13:24:10 crc kubenswrapper[4912]: I0318 13:24:10.339836 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:24:10 crc kubenswrapper[4912]: I0318 13:24:10.857870 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42"] Mar 18 13:24:10 crc kubenswrapper[4912]: I0318 13:24:10.859868 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:10 crc kubenswrapper[4912]: I0318 13:24:10.879746 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42"] Mar 18 13:24:10 crc kubenswrapper[4912]: I0318 13:24:10.885771 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-zjk2q" Mar 18 13:24:10 crc kubenswrapper[4912]: I0318 13:24:10.886088 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.016840 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be9dbd3b-a78d-4306-b834-3cd7c60d7d05-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-w8j42\" (UID: 
\"be9dbd3b-a78d-4306-b834-3cd7c60d7d05\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.016952 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kssk\" (UniqueName: \"kubernetes.io/projected/be9dbd3b-a78d-4306-b834-3cd7c60d7d05-kube-api-access-2kssk\") pod \"observability-ui-dashboards-7f87b9b85b-w8j42\" (UID: \"be9dbd3b-a78d-4306-b834-3cd7c60d7d05\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.124383 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kssk\" (UniqueName: \"kubernetes.io/projected/be9dbd3b-a78d-4306-b834-3cd7c60d7d05-kube-api-access-2kssk\") pod \"observability-ui-dashboards-7f87b9b85b-w8j42\" (UID: \"be9dbd3b-a78d-4306-b834-3cd7c60d7d05\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.124711 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be9dbd3b-a78d-4306-b834-3cd7c60d7d05-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-w8j42\" (UID: \"be9dbd3b-a78d-4306-b834-3cd7c60d7d05\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.146215 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be9dbd3b-a78d-4306-b834-3cd7c60d7d05-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-w8j42\" (UID: \"be9dbd3b-a78d-4306-b834-3cd7c60d7d05\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.155647 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2kssk\" (UniqueName: \"kubernetes.io/projected/be9dbd3b-a78d-4306-b834-3cd7c60d7d05-kube-api-access-2kssk\") pod \"observability-ui-dashboards-7f87b9b85b-w8j42\" (UID: \"be9dbd3b-a78d-4306-b834-3cd7c60d7d05\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.210665 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.216260 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c665c8f96-4wcs8"] Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.219815 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.245989 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c665c8f96-4wcs8"] Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.259810 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-oauth-serving-cert\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.260011 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-trusted-ca-bundle\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.260130 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-console-config\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.260233 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-service-ca\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.260388 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0050439a-3a22-49ef-8b64-4fb98592d68b-console-serving-cert\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.260554 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2qt\" (UniqueName: \"kubernetes.io/projected/0050439a-3a22-49ef-8b64-4fb98592d68b-kube-api-access-lv2qt\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.260608 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0050439a-3a22-49ef-8b64-4fb98592d68b-console-oauth-config\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.345936 4912 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.351325 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.354611 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.355633 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.367549 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-console-config\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.367809 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-service-ca\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.368208 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0050439a-3a22-49ef-8b64-4fb98592d68b-console-serving-cert\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.368324 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2qt\" (UniqueName: 
\"kubernetes.io/projected/0050439a-3a22-49ef-8b64-4fb98592d68b-kube-api-access-lv2qt\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.372108 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0050439a-3a22-49ef-8b64-4fb98592d68b-console-oauth-config\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.372248 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.372328 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-oauth-serving-cert\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.372486 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-trusted-ca-bundle\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.372664 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.372815 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 13:24:11 crc 
kubenswrapper[4912]: I0318 13:24:11.372940 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.373127 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l5hz4" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.373369 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.374060 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-trusted-ca-bundle\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.374745 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-console-config\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.375449 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-service-ca\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.384255 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0050439a-3a22-49ef-8b64-4fb98592d68b-oauth-serving-cert\") pod \"console-c665c8f96-4wcs8\" (UID: 
\"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.387280 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0050439a-3a22-49ef-8b64-4fb98592d68b-console-serving-cert\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.387766 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0050439a-3a22-49ef-8b64-4fb98592d68b-console-oauth-config\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.396815 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.409153 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2qt\" (UniqueName: \"kubernetes.io/projected/0050439a-3a22-49ef-8b64-4fb98592d68b-kube-api-access-lv2qt\") pod \"console-c665c8f96-4wcs8\" (UID: \"0050439a-3a22-49ef-8b64-4fb98592d68b\") " pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475209 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475308 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475358 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73069f34-9c0b-4204-a2f3-8b283232ce86-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475409 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475497 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475550 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475618 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klxcc\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-kube-api-access-klxcc\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.475706 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.476025 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-config\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.476147 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578167 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578251 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578286 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578327 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klxcc\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-kube-api-access-klxcc\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578374 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578444 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-config\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578476 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578516 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578542 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.578562 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73069f34-9c0b-4204-a2f3-8b283232ce86-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.580980 4912 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.581380 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.581654 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.591954 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.593088 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-config\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.593337 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73069f34-9c0b-4204-a2f3-8b283232ce86-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.593356 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.594299 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.594337 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/762887b1ff77fd04cff9a8d1f3f0d3bfb1e91ae8558b3b5a91b139edd2c848bf/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.594719 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.600587 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klxcc\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-kube-api-access-klxcc\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.632147 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.677474 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:11 crc kubenswrapper[4912]: I0318 13:24:11.784294 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.284010 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7nd97"] Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.286318 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.291518 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vhkpt" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.292193 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.292220 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.333190 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7nd97"] Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.375189 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tbb6v"] Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.378250 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.403887 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tbb6v"] Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.407741 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-run\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.409146 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-run-ovn\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.409258 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-log-ovn\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.410089 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5353be6e-99f8-4367-a237-99e0bd3bab04-scripts\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.410299 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5353be6e-99f8-4367-a237-99e0bd3bab04-combined-ca-bundle\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.410485 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlf9g\" (UniqueName: \"kubernetes.io/projected/5353be6e-99f8-4367-a237-99e0bd3bab04-kube-api-access-hlf9g\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.410583 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5353be6e-99f8-4367-a237-99e0bd3bab04-ovn-controller-tls-certs\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.512597 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5353be6e-99f8-4367-a237-99e0bd3bab04-combined-ca-bundle\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.512766 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlf9g\" (UniqueName: \"kubernetes.io/projected/5353be6e-99f8-4367-a237-99e0bd3bab04-kube-api-access-hlf9g\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.512816 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-etc-ovs\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.512840 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5353be6e-99f8-4367-a237-99e0bd3bab04-ovn-controller-tls-certs\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.512869 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-scripts\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.512916 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-run\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.514504 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-run\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.513623 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-run\") pod \"ovn-controller-7nd97\" (UID: 
\"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.514648 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2m4\" (UniqueName: \"kubernetes.io/projected/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-kube-api-access-xz2m4\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.514788 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-run-ovn\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.514922 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-log-ovn\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.515084 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-run-ovn\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.515604 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5353be6e-99f8-4367-a237-99e0bd3bab04-var-log-ovn\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 
13:24:12.517162 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-log\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.517525 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5353be6e-99f8-4367-a237-99e0bd3bab04-combined-ca-bundle\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.518545 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5353be6e-99f8-4367-a237-99e0bd3bab04-scripts\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.521683 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5353be6e-99f8-4367-a237-99e0bd3bab04-ovn-controller-tls-certs\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.525722 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5353be6e-99f8-4367-a237-99e0bd3bab04-scripts\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.525921 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-lib\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.530261 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlf9g\" (UniqueName: \"kubernetes.io/projected/5353be6e-99f8-4367-a237-99e0bd3bab04-kube-api-access-hlf9g\") pod \"ovn-controller-7nd97\" (UID: \"5353be6e-99f8-4367-a237-99e0bd3bab04\") " pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.626258 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7nd97" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.627810 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-lib\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.627928 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-etc-ovs\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.627962 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-scripts\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.628000 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-run\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.628050 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2m4\" (UniqueName: \"kubernetes.io/projected/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-kube-api-access-xz2m4\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.628073 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-lib\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.628105 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-log\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.628162 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-run\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.628281 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-etc-ovs\") pod \"ovn-controller-ovs-tbb6v\" (UID: 
\"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.628520 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-var-log\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.630541 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-scripts\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.654145 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2m4\" (UniqueName: \"kubernetes.io/projected/2524b573-8f88-4fd6-8b1d-c3a4f39e0620-kube-api-access-xz2m4\") pod \"ovn-controller-ovs-tbb6v\" (UID: \"2524b573-8f88-4fd6-8b1d-c3a4f39e0620\") " pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:12 crc kubenswrapper[4912]: I0318 13:24:12.710373 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.157267 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.160651 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.165683 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.165875 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-452sk" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.166211 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.166446 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.171261 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.179786 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286020 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286192 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b193ddb0-beb0-47c2-80c0-a301e580d2b1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286227 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b193ddb0-beb0-47c2-80c0-a301e580d2b1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286264 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286294 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fd40eb88-784b-4a5c-9744-de840d98598f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd40eb88-784b-4a5c-9744-de840d98598f\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286326 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b193ddb0-beb0-47c2-80c0-a301e580d2b1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286380 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7qp\" (UniqueName: \"kubernetes.io/projected/b193ddb0-beb0-47c2-80c0-a301e580d2b1-kube-api-access-ht7qp\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.286906 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389011 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b193ddb0-beb0-47c2-80c0-a301e580d2b1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389094 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b193ddb0-beb0-47c2-80c0-a301e580d2b1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389142 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389173 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fd40eb88-784b-4a5c-9744-de840d98598f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd40eb88-784b-4a5c-9744-de840d98598f\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389204 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b193ddb0-beb0-47c2-80c0-a301e580d2b1-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389231 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7qp\" (UniqueName: \"kubernetes.io/projected/b193ddb0-beb0-47c2-80c0-a301e580d2b1-kube-api-access-ht7qp\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389269 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389343 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.389722 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b193ddb0-beb0-47c2-80c0-a301e580d2b1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.390308 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b193ddb0-beb0-47c2-80c0-a301e580d2b1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.390909 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b193ddb0-beb0-47c2-80c0-a301e580d2b1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.395431 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.395478 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fd40eb88-784b-4a5c-9744-de840d98598f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd40eb88-784b-4a5c-9744-de840d98598f\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0cf714bb1527541641c6025d09bde236a66bf29e2cdd6fbb6ab84025c52ce859/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.395624 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.401151 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.401896 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b193ddb0-beb0-47c2-80c0-a301e580d2b1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.417501 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7qp\" (UniqueName: \"kubernetes.io/projected/b193ddb0-beb0-47c2-80c0-a301e580d2b1-kube-api-access-ht7qp\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.446334 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fd40eb88-784b-4a5c-9744-de840d98598f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd40eb88-784b-4a5c-9744-de840d98598f\") pod \"ovsdbserver-sb-0\" (UID: \"b193ddb0-beb0-47c2-80c0-a301e580d2b1\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:14 crc kubenswrapper[4912]: I0318 13:24:14.501053 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:15 crc kubenswrapper[4912]: I0318 13:24:15.811865 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.664686 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.668482 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.671026 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.671332 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.671482 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xxctd" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.671877 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.678407 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.777818 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.777910 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbmf\" (UniqueName: \"kubernetes.io/projected/6bc55c08-667c-4803-86d4-e30cf29b4bb6-kube-api-access-jfbmf\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.778053 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc55c08-667c-4803-86d4-e30cf29b4bb6-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.778324 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bc55c08-667c-4803-86d4-e30cf29b4bb6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.778404 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.778570 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.778650 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.778717 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bc55c08-667c-4803-86d4-e30cf29b4bb6-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: W0318 13:24:16.785919 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77736799_2ebe_4076_9717_6741aed93599.slice/crio-8e1ec5bd50cb63146ee3b6a6138f3979e6fe9915e5c8c58b1c187110aa028e28 WatchSource:0}: Error finding container 8e1ec5bd50cb63146ee3b6a6138f3979e6fe9915e5c8c58b1c187110aa028e28: Status 404 returned error can't find the container with id 8e1ec5bd50cb63146ee3b6a6138f3979e6fe9915e5c8c58b1c187110aa028e28 Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.881645 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.881790 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.881841 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bc55c08-667c-4803-86d4-e30cf29b4bb6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.881907 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.881968 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbmf\" (UniqueName: \"kubernetes.io/projected/6bc55c08-667c-4803-86d4-e30cf29b4bb6-kube-api-access-jfbmf\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.882077 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc55c08-667c-4803-86d4-e30cf29b4bb6-config\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.882126 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bc55c08-667c-4803-86d4-e30cf29b4bb6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.882147 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.883323 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bc55c08-667c-4803-86d4-e30cf29b4bb6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " 
pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.883875 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bc55c08-667c-4803-86d4-e30cf29b4bb6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.884358 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc55c08-667c-4803-86d4-e30cf29b4bb6-config\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.886704 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.887298 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a0112d26213a5cbd5d4251c67976c919ffb8d6b1ffea39408b60d22964001a2e/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.891909 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.893979 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.898074 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc55c08-667c-4803-86d4-e30cf29b4bb6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.903502 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbmf\" (UniqueName: \"kubernetes.io/projected/6bc55c08-667c-4803-86d4-e30cf29b4bb6-kube-api-access-jfbmf\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:16 crc kubenswrapper[4912]: I0318 13:24:16.932407 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d091ca48-49e1-4067-a102-7a2ce3c0f252\") pod \"ovsdbserver-nb-0\" (UID: \"6bc55c08-667c-4803-86d4-e30cf29b4bb6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:17 crc kubenswrapper[4912]: I0318 13:24:17.009341 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:17 crc kubenswrapper[4912]: I0318 13:24:17.511981 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77736799-2ebe-4076-9717-6741aed93599","Type":"ContainerStarted","Data":"8e1ec5bd50cb63146ee3b6a6138f3979e6fe9915e5c8c58b1c187110aa028e28"} Mar 18 13:24:20 crc kubenswrapper[4912]: I0318 13:24:20.916719 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:24:21 crc kubenswrapper[4912]: I0318 13:24:21.292812 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 13:24:21 crc kubenswrapper[4912]: I0318 13:24:21.323976 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42"] Mar 18 13:24:21 crc kubenswrapper[4912]: I0318 13:24:21.333072 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c665c8f96-4wcs8"] Mar 18 13:24:22 crc kubenswrapper[4912]: I0318 13:24:22.218496 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tbb6v"] Mar 18 13:24:22 crc kubenswrapper[4912]: I0318 13:24:22.899644 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 13:24:25 crc kubenswrapper[4912]: I0318 13:24:25.751647 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7nd97"] Mar 18 13:24:29 crc kubenswrapper[4912]: W0318 13:24:29.685342 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73069f34_9c0b_4204_a2f3_8b283232ce86.slice/crio-5a30928ffa23aea28118024b95923d71356d6df1864b6e4c7e30f5adf0bc5bfe WatchSource:0}: Error finding container 5a30928ffa23aea28118024b95923d71356d6df1864b6e4c7e30f5adf0bc5bfe: Status 404 returned error can't find the container with id 
5a30928ffa23aea28118024b95923d71356d6df1864b6e4c7e30f5adf0bc5bfe Mar 18 13:24:29 crc kubenswrapper[4912]: W0318 13:24:29.691392 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f35e63e_80c3_4dca_b383_9650e3aa63a2.slice/crio-aa2a8cef67cba04e1ae6b1955c42d6d5c3ea2a5040cc5a54607b80a92c1334bc WatchSource:0}: Error finding container aa2a8cef67cba04e1ae6b1955c42d6d5c3ea2a5040cc5a54607b80a92c1334bc: Status 404 returned error can't find the container with id aa2a8cef67cba04e1ae6b1955c42d6d5c3ea2a5040cc5a54607b80a92c1334bc Mar 18 13:24:29 crc kubenswrapper[4912]: W0318 13:24:29.694382 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0050439a_3a22_49ef_8b64_4fb98592d68b.slice/crio-6cc9b87455c7844495deb673636148c055fa813b5b7877be35eae3baa8f5e1ce WatchSource:0}: Error finding container 6cc9b87455c7844495deb673636148c055fa813b5b7877be35eae3baa8f5e1ce: Status 404 returned error can't find the container with id 6cc9b87455c7844495deb673636148c055fa813b5b7877be35eae3baa8f5e1ce Mar 18 13:24:29 crc kubenswrapper[4912]: W0318 13:24:29.696248 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe9dbd3b_a78d_4306_b834_3cd7c60d7d05.slice/crio-6848e4e172a9fd6dff940e0db120b4d2fd32afd33dc26a5ecbf3c9dfd0e50a62 WatchSource:0}: Error finding container 6848e4e172a9fd6dff940e0db120b4d2fd32afd33dc26a5ecbf3c9dfd0e50a62: Status 404 returned error can't find the container with id 6848e4e172a9fd6dff940e0db120b4d2fd32afd33dc26a5ecbf3c9dfd0e50a62 Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.634966 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 13:24:30 crc 
kubenswrapper[4912]: E0318 13:24:30.636018 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dd6mw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-wgstq_openstack(8999bed8-e35f-4458-9ca7-03c5352b8f4a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.637274 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" podUID="8999bed8-e35f-4458-9ca7-03c5352b8f4a" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.638617 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.638829 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxfvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-jm6sw_openstack(416535c8-1b0c-4a0a-a789-0223c011a4bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.640541 4912 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.647928 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.648160 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvhk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-r7pms_openstack(6144ea19-b46e-449b-85a2-89be0d315561): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.649312 4912 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" podUID="6144ea19-b46e-449b-85a2-89be0d315561" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.664561 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.664694 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-74r74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6c5r8_openstack(5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.666213 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" podUID="5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e" Mar 18 13:24:30 crc kubenswrapper[4912]: I0318 13:24:30.678392 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97" event={"ID":"5353be6e-99f8-4367-a237-99e0bd3bab04","Type":"ContainerStarted","Data":"8fd78766af8f899a1643276466ad642cc6daa90f0f8784605479bac1e4c9efed"} Mar 18 13:24:30 crc kubenswrapper[4912]: I0318 13:24:30.680836 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c665c8f96-4wcs8" event={"ID":"0050439a-3a22-49ef-8b64-4fb98592d68b","Type":"ContainerStarted","Data":"6cc9b87455c7844495deb673636148c055fa813b5b7877be35eae3baa8f5e1ce"} Mar 18 13:24:30 crc kubenswrapper[4912]: I0318 13:24:30.682518 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b193ddb0-beb0-47c2-80c0-a301e580d2b1","Type":"ContainerStarted","Data":"46fad7755a983276e25c9ef74aca071cc888b5fe51eccd45bf005c4a6c46cc43"} Mar 18 13:24:30 crc kubenswrapper[4912]: I0318 13:24:30.700695 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbb6v" event={"ID":"2524b573-8f88-4fd6-8b1d-c3a4f39e0620","Type":"ContainerStarted","Data":"288033d3499b074d57a819ee5e3a04a0c3724103149c2233dc2b31179bc9a5f5"} Mar 18 13:24:30 crc kubenswrapper[4912]: I0318 13:24:30.717968 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerStarted","Data":"5a30928ffa23aea28118024b95923d71356d6df1864b6e4c7e30f5adf0bc5bfe"} Mar 18 13:24:30 crc kubenswrapper[4912]: I0318 13:24:30.723731 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5f35e63e-80c3-4dca-b383-9650e3aa63a2","Type":"ContainerStarted","Data":"aa2a8cef67cba04e1ae6b1955c42d6d5c3ea2a5040cc5a54607b80a92c1334bc"} Mar 18 13:24:30 crc kubenswrapper[4912]: I0318 
13:24:30.748743 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" event={"ID":"be9dbd3b-a78d-4306-b834-3cd7c60d7d05","Type":"ContainerStarted","Data":"6848e4e172a9fd6dff940e0db120b4d2fd32afd33dc26a5ecbf3c9dfd0e50a62"} Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.756911 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" Mar 18 13:24:30 crc kubenswrapper[4912]: E0318 13:24:30.757030 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" podUID="6144ea19-b46e-449b-85a2-89be0d315561" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.502618 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.553305 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.641203 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.704605 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-config\") pod \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.704695 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd6mw\" (UniqueName: \"kubernetes.io/projected/8999bed8-e35f-4458-9ca7-03c5352b8f4a-kube-api-access-dd6mw\") pod \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.704744 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-dns-svc\") pod \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\" (UID: \"8999bed8-e35f-4458-9ca7-03c5352b8f4a\") " Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.705194 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-config" (OuterVolumeSpecName: "config") pod "8999bed8-e35f-4458-9ca7-03c5352b8f4a" (UID: "8999bed8-e35f-4458-9ca7-03c5352b8f4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.705262 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8999bed8-e35f-4458-9ca7-03c5352b8f4a" (UID: "8999bed8-e35f-4458-9ca7-03c5352b8f4a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.706176 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.706215 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8999bed8-e35f-4458-9ca7-03c5352b8f4a-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.734387 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8999bed8-e35f-4458-9ca7-03c5352b8f4a-kube-api-access-dd6mw" (OuterVolumeSpecName: "kube-api-access-dd6mw") pod "8999bed8-e35f-4458-9ca7-03c5352b8f4a" (UID: "8999bed8-e35f-4458-9ca7-03c5352b8f4a"). InnerVolumeSpecName "kube-api-access-dd6mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.762334 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" event={"ID":"8999bed8-e35f-4458-9ca7-03c5352b8f4a","Type":"ContainerDied","Data":"8f49739a28d6f61da2a866c54dc9516f4d9099daba7e195a96a96d02c834dd24"} Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.762357 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wgstq" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.766795 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5","Type":"ContainerStarted","Data":"5f0835c7937622b13d93069ff7ad874f11e64c220f5ecb71ce17fa4a8f68cdcb"} Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.769911 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0973556-9c2c-4037-b800-d11ecf1904cc","Type":"ContainerStarted","Data":"4aa7932c0f0aa1f6453b1d60126cd68e47bfeda35dd23817406fbe092afd9667"} Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.772854 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77736799-2ebe-4076-9717-6741aed93599","Type":"ContainerStarted","Data":"1abf374a6da7cd2d7c849b32dee645b7366700158c66e49f17cdb829d647f303"} Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.775655 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6bc55c08-667c-4803-86d4-e30cf29b4bb6","Type":"ContainerStarted","Data":"6af36eb9c3df5f6fc6eefe9bf4eaf23611970fcf5f463524be7861e4c246399c"} Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.778579 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c665c8f96-4wcs8" event={"ID":"0050439a-3a22-49ef-8b64-4fb98592d68b","Type":"ContainerStarted","Data":"902fe890ebba0eba046de3dd16b6f2a5f9b183abc29577c612d48926a743e605"} Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.781372 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" event={"ID":"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e","Type":"ContainerDied","Data":"84cde48f8c6dce5453cfc377e0aeb2f31293d35e548d8df8415f9ee22c807443"} Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.781402 4912 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84cde48f8c6dce5453cfc377e0aeb2f31293d35e548d8df8415f9ee22c807443" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.808424 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd6mw\" (UniqueName: \"kubernetes.io/projected/8999bed8-e35f-4458-9ca7-03c5352b8f4a-kube-api-access-dd6mw\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:31 crc kubenswrapper[4912]: I0318 13:24:31.818556 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c665c8f96-4wcs8" podStartSLOduration=20.818528992 podStartE2EDuration="20.818528992s" podCreationTimestamp="2026-03-18 13:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:31.812019527 +0000 UTC m=+1320.271446972" watchObservedRunningTime="2026-03-18 13:24:31.818528992 +0000 UTC m=+1320.277956417" Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.251280 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.435857 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-config\") pod \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.436097 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74r74\" (UniqueName: \"kubernetes.io/projected/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-kube-api-access-74r74\") pod \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\" (UID: \"5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e\") " Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.436882 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-config" (OuterVolumeSpecName: "config") pod "5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e" (UID: "5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.439630 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.446918 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wgstq"] Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.466917 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wgstq"] Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.531417 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-kube-api-access-74r74" (OuterVolumeSpecName: "kube-api-access-74r74") pod "5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e" (UID: "5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e"). InnerVolumeSpecName "kube-api-access-74r74". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.543465 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74r74\" (UniqueName: \"kubernetes.io/projected/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e-kube-api-access-74r74\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.795379 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f54097ed-90b5-4369-8304-8bcb3a7d1839","Type":"ContainerStarted","Data":"1d2a166924f5b4f3e488f2ea7f00fb05c68b56c294cea47dc73fb81918b8845b"} Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.795433 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6c5r8" Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.973186 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6c5r8"] Mar 18 13:24:32 crc kubenswrapper[4912]: I0318 13:24:32.983849 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6c5r8"] Mar 18 13:24:33 crc kubenswrapper[4912]: I0318 13:24:33.810542 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97b795fb-bb07-4401-8e18-0b826303b4ba","Type":"ContainerStarted","Data":"44d2c20bf933e8a8a31df1480a2142c6fd3cd3bffe7c9d96198e1527e8ce82b2"} Mar 18 13:24:33 crc kubenswrapper[4912]: I0318 13:24:33.813531 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1","Type":"ContainerStarted","Data":"89b9b0744f0defd6a923dad45fc73dbb7c753a1d60556a747054c707ca6490fd"} Mar 18 13:24:33 crc kubenswrapper[4912]: I0318 13:24:33.816151 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e25d0c6c-24af-4cb6-b961-ae312ec23df9","Type":"ContainerStarted","Data":"6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1"} Mar 18 13:24:34 crc kubenswrapper[4912]: I0318 13:24:34.191376 4912 scope.go:117] "RemoveContainer" containerID="786a714477551d9d646014e23a3cad3d41f11842b7c0ed76b3216e04a5ddf5d5" Mar 18 13:24:34 crc kubenswrapper[4912]: I0318 13:24:34.243296 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e" path="/var/lib/kubelet/pods/5b1f68fb-3edd-4fd3-ac76-5d201c9f2d5e/volumes" Mar 18 13:24:34 crc kubenswrapper[4912]: I0318 13:24:34.243713 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8999bed8-e35f-4458-9ca7-03c5352b8f4a" path="/var/lib/kubelet/pods/8999bed8-e35f-4458-9ca7-03c5352b8f4a/volumes"
Mar 18 13:24:34 crc kubenswrapper[4912]: I0318 13:24:34.829640 4912 generic.go:334] "Generic (PLEG): container finished" podID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerID="4aa7932c0f0aa1f6453b1d60126cd68e47bfeda35dd23817406fbe092afd9667" exitCode=0 Mar 18 13:24:34 crc kubenswrapper[4912]: I0318 13:24:34.830899 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0973556-9c2c-4037-b800-d11ecf1904cc","Type":"ContainerDied","Data":"4aa7932c0f0aa1f6453b1d60126cd68e47bfeda35dd23817406fbe092afd9667"} Mar 18 13:24:35 crc kubenswrapper[4912]: I0318 13:24:35.843136 4912 generic.go:334] "Generic (PLEG): container finished" podID="77736799-2ebe-4076-9717-6741aed93599" containerID="1abf374a6da7cd2d7c849b32dee645b7366700158c66e49f17cdb829d647f303" exitCode=0 Mar 18 13:24:35 crc kubenswrapper[4912]: I0318 13:24:35.843217 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77736799-2ebe-4076-9717-6741aed93599","Type":"ContainerDied","Data":"1abf374a6da7cd2d7c849b32dee645b7366700158c66e49f17cdb829d647f303"} Mar 18 13:24:37 crc kubenswrapper[4912]: I0318 13:24:36.999459 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:24:37 crc kubenswrapper[4912]: I0318 13:24:37.000136 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:24:38 crc kubenswrapper[4912]: I0318 13:24:38.889984 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"77736799-2ebe-4076-9717-6741aed93599","Type":"ContainerStarted","Data":"2c12695758a8e030ab1bbae40231e9e1240ab28ee6b2233948263853940f7e9f"} Mar 18 13:24:38 crc kubenswrapper[4912]: I0318 13:24:38.926482 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.012629978 podStartE2EDuration="33.926450743s" podCreationTimestamp="2026-03-18 13:24:05 +0000 UTC" firstStartedPulling="2026-03-18 13:24:16.790713887 +0000 UTC m=+1305.250141312" lastFinishedPulling="2026-03-18 13:24:30.704534652 +0000 UTC m=+1319.163962077" observedRunningTime="2026-03-18 13:24:38.92002562 +0000 UTC m=+1327.379453055" watchObservedRunningTime="2026-03-18 13:24:38.926450743 +0000 UTC m=+1327.385878178" Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.905741 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97" event={"ID":"5353be6e-99f8-4367-a237-99e0bd3bab04","Type":"ContainerStarted","Data":"b47a2f2bb159877d2a5132c9e7acab2902c43ab4eea6674fc8aca7404a5c5c05"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.907793 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-7nd97" Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.912735 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6bc55c08-667c-4803-86d4-e30cf29b4bb6","Type":"ContainerStarted","Data":"fdfc3eefc74ca3bdf53d4dbf96c941bdff91d1f9f88d4f2234debb136c1926a0"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.915875 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b193ddb0-beb0-47c2-80c0-a301e580d2b1","Type":"ContainerStarted","Data":"f9a30f4db0f5b6ac7c07b045aeb98887c94fc6c4cec8832c1dd86c6e471b8934"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.919522 4912 generic.go:334] "Generic (PLEG): 
container finished" podID="2524b573-8f88-4fd6-8b1d-c3a4f39e0620" containerID="35f7a3a17a8886a6038f747c4c1a8e163a57f785ea3fe454b176a394f712ec0d" exitCode=0 Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.919621 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbb6v" event={"ID":"2524b573-8f88-4fd6-8b1d-c3a4f39e0620","Type":"ContainerDied","Data":"35f7a3a17a8886a6038f747c4c1a8e163a57f785ea3fe454b176a394f712ec0d"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.921103 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5","Type":"ContainerStarted","Data":"999547ebf1221aff240a1712baa33835560f7b3afdd29efdf6775cf0deedfdde"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.921370 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.925090 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0973556-9c2c-4037-b800-d11ecf1904cc","Type":"ContainerStarted","Data":"0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.936062 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5f35e63e-80c3-4dca-b383-9650e3aa63a2","Type":"ContainerStarted","Data":"c70141d228d30e05188b70c7d8a962e916bfb49bdb29c7fb3e1b123402d3c579"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.937928 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.937939 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7nd97" podStartSLOduration=20.223274269 podStartE2EDuration="27.937888276s" podCreationTimestamp="2026-03-18 13:24:12 +0000 UTC" 
firstStartedPulling="2026-03-18 13:24:29.745479776 +0000 UTC m=+1318.204907201" lastFinishedPulling="2026-03-18 13:24:37.460093783 +0000 UTC m=+1325.919521208" observedRunningTime="2026-03-18 13:24:39.926176741 +0000 UTC m=+1328.385604176" watchObservedRunningTime="2026-03-18 13:24:39.937888276 +0000 UTC m=+1328.397315701" Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.942220 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" event={"ID":"be9dbd3b-a78d-4306-b834-3cd7c60d7d05","Type":"ContainerStarted","Data":"58ee022e99f91eab34d4bf47c8e03843111a41555716edd936b986c708282b81"} Mar 18 13:24:39 crc kubenswrapper[4912]: I0318 13:24:39.958477 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.093468626 podStartE2EDuration="30.958451728s" podCreationTimestamp="2026-03-18 13:24:09 +0000 UTC" firstStartedPulling="2026-03-18 13:24:31.611181989 +0000 UTC m=+1320.070609414" lastFinishedPulling="2026-03-18 13:24:38.476165091 +0000 UTC m=+1326.935592516" observedRunningTime="2026-03-18 13:24:39.941659957 +0000 UTC m=+1328.401087402" watchObservedRunningTime="2026-03-18 13:24:39.958451728 +0000 UTC m=+1328.417879153" Mar 18 13:24:40 crc kubenswrapper[4912]: I0318 13:24:40.016976 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.440846095 podStartE2EDuration="36.016952611s" podCreationTimestamp="2026-03-18 13:24:04 +0000 UTC" firstStartedPulling="2026-03-18 13:24:07.08170246 +0000 UTC m=+1295.541129885" lastFinishedPulling="2026-03-18 13:24:30.657808976 +0000 UTC m=+1319.117236401" observedRunningTime="2026-03-18 13:24:39.996981354 +0000 UTC m=+1328.456408789" watchObservedRunningTime="2026-03-18 13:24:40.016952611 +0000 UTC m=+1328.476380036" Mar 18 13:24:40 crc kubenswrapper[4912]: I0318 13:24:40.052440 4912 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=25.907027506 podStartE2EDuration="33.052412484s" podCreationTimestamp="2026-03-18 13:24:07 +0000 UTC" firstStartedPulling="2026-03-18 13:24:29.722106838 +0000 UTC m=+1318.181534283" lastFinishedPulling="2026-03-18 13:24:36.867491846 +0000 UTC m=+1325.326919261" observedRunningTime="2026-03-18 13:24:40.01690746 +0000 UTC m=+1328.476334895" watchObservedRunningTime="2026-03-18 13:24:40.052412484 +0000 UTC m=+1328.511839909" Mar 18 13:24:40 crc kubenswrapper[4912]: I0318 13:24:40.109953 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-w8j42" podStartSLOduration=22.646317059 podStartE2EDuration="30.10993007s" podCreationTimestamp="2026-03-18 13:24:10 +0000 UTC" firstStartedPulling="2026-03-18 13:24:29.722731195 +0000 UTC m=+1318.182158610" lastFinishedPulling="2026-03-18 13:24:37.186344196 +0000 UTC m=+1325.645771621" observedRunningTime="2026-03-18 13:24:40.04297512 +0000 UTC m=+1328.502402555" watchObservedRunningTime="2026-03-18 13:24:40.10993007 +0000 UTC m=+1328.569357505" Mar 18 13:24:40 crc kubenswrapper[4912]: I0318 13:24:40.957081 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbb6v" event={"ID":"2524b573-8f88-4fd6-8b1d-c3a4f39e0620","Type":"ContainerStarted","Data":"37b05d7cee68e334e399c16678894c4b06db39c4802df58632eda4d8215c570a"} Mar 18 13:24:41 crc kubenswrapper[4912]: I0318 13:24:41.597443 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:41 crc kubenswrapper[4912]: I0318 13:24:41.599120 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:41 crc kubenswrapper[4912]: I0318 13:24:41.606407 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:41 crc kubenswrapper[4912]: I0318 13:24:41.978798 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tbb6v" event={"ID":"2524b573-8f88-4fd6-8b1d-c3a4f39e0620","Type":"ContainerStarted","Data":"fbde5b46828df536bbe3e9248fcf21364c15d9f151253fce5659c3fa2e7471aa"} Mar 18 13:24:41 crc kubenswrapper[4912]: I0318 13:24:41.984020 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 13:24:42 crc kubenswrapper[4912]: I0318 13:24:42.009117 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tbb6v" podStartSLOduration=22.445472733 podStartE2EDuration="30.009086071s" podCreationTimestamp="2026-03-18 13:24:12 +0000 UTC" firstStartedPulling="2026-03-18 13:24:29.683362707 +0000 UTC m=+1318.142790132" lastFinishedPulling="2026-03-18 13:24:37.246976025 +0000 UTC m=+1325.706403470" observedRunningTime="2026-03-18 13:24:41.999437682 +0000 UTC m=+1330.458865117" watchObservedRunningTime="2026-03-18 13:24:42.009086071 +0000 UTC m=+1330.468513506" Mar 18 13:24:42 crc kubenswrapper[4912]: I0318 13:24:42.063940 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fccc4d7b-dngkq"] Mar 18 13:24:42 crc kubenswrapper[4912]: I0318 13:24:42.711820 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:42 crc kubenswrapper[4912]: I0318 13:24:42.712258 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:24:42 crc kubenswrapper[4912]: I0318 13:24:42.993762 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerStarted","Data":"61cd9944abf703e9cd8146a733e46295eaa8dbab025cf3ccba1d6d52b50d2ed2"} Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.008473 4912 generic.go:334] "Generic (PLEG): container finished" podID="6144ea19-b46e-449b-85a2-89be0d315561" containerID="fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c" exitCode=0 Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.008549 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" event={"ID":"6144ea19-b46e-449b-85a2-89be0d315561","Type":"ContainerDied","Data":"fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c"} Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.017257 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6bc55c08-667c-4803-86d4-e30cf29b4bb6","Type":"ContainerStarted","Data":"6d2073c1d7ccab96b12cde6fddd01ba9ee83b813ae788e119ce451573478048f"} Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.023396 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b193ddb0-beb0-47c2-80c0-a301e580d2b1","Type":"ContainerStarted","Data":"829477e3de25284a32e3add97e3aff2e378d9e9cc0924ae666311f684dec40de"} Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.067247 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.491417524 podStartE2EDuration="31.067219165s" podCreationTimestamp="2026-03-18 13:24:13 +0000 UTC" firstStartedPulling="2026-03-18 13:24:29.72255481 +0000 UTC m=+1318.181982245" lastFinishedPulling="2026-03-18 13:24:43.298356461 +0000 UTC m=+1331.757783886" observedRunningTime="2026-03-18 13:24:44.054491593 +0000 UTC m=+1332.513919048" watchObservedRunningTime="2026-03-18 13:24:44.067219165 +0000 UTC m=+1332.526646580"
Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.085886 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.338334372 podStartE2EDuration="29.085867006s" podCreationTimestamp="2026-03-18 13:24:15 +0000 UTC" firstStartedPulling="2026-03-18 13:24:31.52937272 +0000 UTC m=+1319.988800135" lastFinishedPulling="2026-03-18 13:24:43.276905344 +0000 UTC m=+1331.736332769" observedRunningTime="2026-03-18 13:24:44.077390248 +0000 UTC m=+1332.536817693" watchObservedRunningTime="2026-03-18 13:24:44.085867006 +0000 UTC m=+1332.545294431" Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.501846 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.502291 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:44 crc kubenswrapper[4912]: I0318 13:24:44.546154 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.033280 4912 generic.go:334] "Generic (PLEG): container finished" podID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerID="4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72" exitCode=0 Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.033790 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" event={"ID":"416535c8-1b0c-4a0a-a789-0223c011a4bb","Type":"ContainerDied","Data":"4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72"} Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.037942 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" event={"ID":"6144ea19-b46e-449b-85a2-89be0d315561","Type":"ContainerStarted","Data":"942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916"} Mar 18 13:24:45 crc kubenswrapper[4912]: 
I0318 13:24:45.038734 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.097098 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" podStartSLOduration=3.124446962 podStartE2EDuration="42.097069903s" podCreationTimestamp="2026-03-18 13:24:03 +0000 UTC" firstStartedPulling="2026-03-18 13:24:04.734879676 +0000 UTC m=+1293.194307101" lastFinishedPulling="2026-03-18 13:24:43.707502617 +0000 UTC m=+1332.166930042" observedRunningTime="2026-03-18 13:24:45.078265908 +0000 UTC m=+1333.537693343" watchObservedRunningTime="2026-03-18 13:24:45.097069903 +0000 UTC m=+1333.556497328" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.108545 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.463328 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-jm6sw"] Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.495425 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-n9bh9"] Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.497572 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.501529 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.521677 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-n9bh9"] Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.594531 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.594592 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-config\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.594689 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.594714 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbrvt\" (UniqueName: \"kubernetes.io/projected/950ce283-d3ce-4334-9a58-78715bce4995-kube-api-access-gbrvt\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.611078 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-qqfb7"] Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.613000 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.619115 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.633171 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qqfb7"] Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.697615 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f2b4068a-eb2f-4744-afc6-353f9704e68f-ovn-rundir\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.697683 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.697723 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbrvt\" (UniqueName: \"kubernetes.io/projected/950ce283-d3ce-4334-9a58-78715bce4995-kube-api-access-gbrvt\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.697751 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b4068a-eb2f-4744-afc6-353f9704e68f-combined-ca-bundle\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.697914 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f2b4068a-eb2f-4744-afc6-353f9704e68f-ovs-rundir\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.697974 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b4068a-eb2f-4744-afc6-353f9704e68f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.698021 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.698084 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-config\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.698318 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p644r\" (UniqueName: \"kubernetes.io/projected/f2b4068a-eb2f-4744-afc6-353f9704e68f-kube-api-access-p644r\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.698354 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b4068a-eb2f-4744-afc6-353f9704e68f-config\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.699818 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.700911 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.701105 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-config\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.745141 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gbrvt\" (UniqueName: \"kubernetes.io/projected/950ce283-d3ce-4334-9a58-78715bce4995-kube-api-access-gbrvt\") pod \"dnsmasq-dns-6bc7876d45-n9bh9\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.801024 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f2b4068a-eb2f-4744-afc6-353f9704e68f-ovs-rundir\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.801214 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b4068a-eb2f-4744-afc6-353f9704e68f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.801361 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p644r\" (UniqueName: \"kubernetes.io/projected/f2b4068a-eb2f-4744-afc6-353f9704e68f-kube-api-access-p644r\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.801385 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b4068a-eb2f-4744-afc6-353f9704e68f-config\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.801413 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/f2b4068a-eb2f-4744-afc6-353f9704e68f-ovn-rundir\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.801434 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b4068a-eb2f-4744-afc6-353f9704e68f-combined-ca-bundle\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.801799 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f2b4068a-eb2f-4744-afc6-353f9704e68f-ovs-rundir\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.802241 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f2b4068a-eb2f-4744-afc6-353f9704e68f-ovn-rundir\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.802694 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b4068a-eb2f-4744-afc6-353f9704e68f-config\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.805054 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b4068a-eb2f-4744-afc6-353f9704e68f-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.805991 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b4068a-eb2f-4744-afc6-353f9704e68f-combined-ca-bundle\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.818087 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.827964 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p644r\" (UniqueName: \"kubernetes.io/projected/f2b4068a-eb2f-4744-afc6-353f9704e68f-kube-api-access-p644r\") pod \"ovn-controller-metrics-qqfb7\" (UID: \"f2b4068a-eb2f-4744-afc6-353f9704e68f\") " pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:45 crc kubenswrapper[4912]: I0318 13:24:45.937721 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-qqfb7" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.011598 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-r7pms"] Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.085815 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.086553 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.117482 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" event={"ID":"416535c8-1b0c-4a0a-a789-0223c011a4bb","Type":"ContainerStarted","Data":"999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8"} Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.117545 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerName="dnsmasq-dns" containerID="cri-o://999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8" gracePeriod=10 Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.117632 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.118381 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-z9qpx"] Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.126429 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.133495 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.219818 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z9qpx"] Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.233680 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" podStartSLOduration=-9223371992.621124 podStartE2EDuration="44.23365113s" podCreationTimestamp="2026-03-18 13:24:02 +0000 UTC" firstStartedPulling="2026-03-18 13:24:04.36443728 +0000 UTC m=+1292.823864705" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:46.178638032 +0000 UTC m=+1334.638065477" watchObservedRunningTime="2026-03-18 13:24:46.23365113 +0000 UTC m=+1334.693078555" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.237002 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.255892 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22wg\" (UniqueName: \"kubernetes.io/projected/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-kube-api-access-b22wg\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.256166 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-dns-svc\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.256335 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-config\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.256736 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.358984 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.360132 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.360289 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22wg\" (UniqueName: 
\"kubernetes.io/projected/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-kube-api-access-b22wg\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.360344 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-dns-svc\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.360401 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-config\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.360795 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.363918 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.364656 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-config\") pod \"dnsmasq-dns-8554648995-z9qpx\" 
(UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.365184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-dns-svc\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.384651 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.395769 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22wg\" (UniqueName: \"kubernetes.io/projected/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-kube-api-access-b22wg\") pod \"dnsmasq-dns-8554648995-z9qpx\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.552297 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-n9bh9"] Mar 18 13:24:46 crc kubenswrapper[4912]: W0318 13:24:46.560680 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod950ce283_d3ce_4334_9a58_78715bce4995.slice/crio-f48ad5ff6f7fcf5ed563eca491924c359b58b413da871664813e30580d580f3d WatchSource:0}: Error finding container f48ad5ff6f7fcf5ed563eca491924c359b58b413da871664813e30580d580f3d: Status 404 returned error can't find the container with id f48ad5ff6f7fcf5ed563eca491924c359b58b413da871664813e30580d580f3d Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.573653 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.651479 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qqfb7"] Mar 18 13:24:46 crc kubenswrapper[4912]: W0318 13:24:46.680633 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b4068a_eb2f_4744_afc6_353f9704e68f.slice/crio-5dec36844707677f44eeede888676a02b67dd80630736a6b86e5c4353134c366 WatchSource:0}: Error finding container 5dec36844707677f44eeede888676a02b67dd80630736a6b86e5c4353134c366: Status 404 returned error can't find the container with id 5dec36844707677f44eeede888676a02b67dd80630736a6b86e5c4353134c366 Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.707462 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.882997 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-config\") pod \"416535c8-1b0c-4a0a-a789-0223c011a4bb\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.883585 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxfvp\" (UniqueName: \"kubernetes.io/projected/416535c8-1b0c-4a0a-a789-0223c011a4bb-kube-api-access-lxfvp\") pod \"416535c8-1b0c-4a0a-a789-0223c011a4bb\" (UID: \"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.883823 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-dns-svc\") pod \"416535c8-1b0c-4a0a-a789-0223c011a4bb\" (UID: 
\"416535c8-1b0c-4a0a-a789-0223c011a4bb\") " Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.895248 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416535c8-1b0c-4a0a-a789-0223c011a4bb-kube-api-access-lxfvp" (OuterVolumeSpecName: "kube-api-access-lxfvp") pod "416535c8-1b0c-4a0a-a789-0223c011a4bb" (UID: "416535c8-1b0c-4a0a-a789-0223c011a4bb"). InnerVolumeSpecName "kube-api-access-lxfvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.970833 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-config" (OuterVolumeSpecName: "config") pod "416535c8-1b0c-4a0a-a789-0223c011a4bb" (UID: "416535c8-1b0c-4a0a-a789-0223c011a4bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.978068 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "416535c8-1b0c-4a0a-a789-0223c011a4bb" (UID: "416535c8-1b0c-4a0a-a789-0223c011a4bb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.986694 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.986728 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxfvp\" (UniqueName: \"kubernetes.io/projected/416535c8-1b0c-4a0a-a789-0223c011a4bb-kube-api-access-lxfvp\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:46 crc kubenswrapper[4912]: I0318 13:24:46.986746 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/416535c8-1b0c-4a0a-a789-0223c011a4bb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.008857 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.008949 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.086109 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.130128 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qqfb7" event={"ID":"f2b4068a-eb2f-4744-afc6-353f9704e68f","Type":"ContainerStarted","Data":"5dec36844707677f44eeede888676a02b67dd80630736a6b86e5c4353134c366"} Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.133443 4912 generic.go:334] "Generic (PLEG): container finished" podID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerID="999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8" exitCode=0 Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.133506 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" event={"ID":"416535c8-1b0c-4a0a-a789-0223c011a4bb","Type":"ContainerDied","Data":"999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8"} Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.133534 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" event={"ID":"416535c8-1b0c-4a0a-a789-0223c011a4bb","Type":"ContainerDied","Data":"052d30a92973759b276844c1b3d69e2fd9a2291b61facf441f66a5cb118418a7"} Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.133558 4912 scope.go:117] "RemoveContainer" containerID="999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.133737 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-jm6sw" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.147629 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" event={"ID":"950ce283-d3ce-4334-9a58-78715bce4995","Type":"ContainerStarted","Data":"f48ad5ff6f7fcf5ed563eca491924c359b58b413da871664813e30580d580f3d"} Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.148495 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" podUID="6144ea19-b46e-449b-85a2-89be0d315561" containerName="dnsmasq-dns" containerID="cri-o://942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916" gracePeriod=10 Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.188751 4912 scope.go:117] "RemoveContainer" containerID="4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.199921 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-jm6sw"] Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 
13:24:47.212389 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-jm6sw"] Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.232959 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.235692 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z9qpx"] Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.318871 4912 scope.go:117] "RemoveContainer" containerID="999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8" Mar 18 13:24:47 crc kubenswrapper[4912]: E0318 13:24:47.319913 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8\": container with ID starting with 999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8 not found: ID does not exist" containerID="999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.319961 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8"} err="failed to get container status \"999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8\": rpc error: code = NotFound desc = could not find container \"999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8\": container with ID starting with 999eeb3a322093fd6720058e105a9ba1fe7dae6ee8b83da175583da917b568c8 not found: ID does not exist" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.319990 4912 scope.go:117] "RemoveContainer" containerID="4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72" Mar 18 13:24:47 crc kubenswrapper[4912]: E0318 13:24:47.320468 4912 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72\": container with ID starting with 4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72 not found: ID does not exist" containerID="4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.320536 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72"} err="failed to get container status \"4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72\": rpc error: code = NotFound desc = could not find container \"4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72\": container with ID starting with 4ca918dd2124add25dea6796a1214718c82edb3dbab1824f1fbe771a15e68f72 not found: ID does not exist" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.368499 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.368559 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.382131 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.525174 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.556703 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 13:24:47 crc kubenswrapper[4912]: E0318 13:24:47.565329 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerName="init" Mar 18 13:24:47 crc 
kubenswrapper[4912]: I0318 13:24:47.565364 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerName="init" Mar 18 13:24:47 crc kubenswrapper[4912]: E0318 13:24:47.565395 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerName="dnsmasq-dns" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.565402 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerName="dnsmasq-dns" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.565619 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" containerName="dnsmasq-dns" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.566832 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.586612 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.589409 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.589637 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.595688 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-spr29" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.627163 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 
crc kubenswrapper[4912]: I0318 13:24:47.627289 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/85faf7f2-1b95-4210-88b1-cae393033960-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.627354 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.627532 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85faf7f2-1b95-4210-88b1-cae393033960-config\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.627601 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.628075 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85faf7f2-1b95-4210-88b1-cae393033960-scripts\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.628189 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/85faf7f2-1b95-4210-88b1-cae393033960-kube-api-access-mpttn\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.629294 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.669656 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.746085 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85faf7f2-1b95-4210-88b1-cae393033960-scripts\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.746616 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/85faf7f2-1b95-4210-88b1-cae393033960-kube-api-access-mpttn\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.746673 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.746742 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/85faf7f2-1b95-4210-88b1-cae393033960-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 
18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.746784 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.746903 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85faf7f2-1b95-4210-88b1-cae393033960-config\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.746960 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.750224 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85faf7f2-1b95-4210-88b1-cae393033960-scripts\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.751608 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85faf7f2-1b95-4210-88b1-cae393033960-config\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.751864 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/85faf7f2-1b95-4210-88b1-cae393033960-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.765319 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.765550 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.770478 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85faf7f2-1b95-4210-88b1-cae393033960-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.800585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/85faf7f2-1b95-4210-88b1-cae393033960-kube-api-access-mpttn\") pod \"ovn-northd-0\" (UID: \"85faf7f2-1b95-4210-88b1-cae393033960\") " pod="openstack/ovn-northd-0" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.896349 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:47 crc kubenswrapper[4912]: I0318 13:24:47.953823 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.055818 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvhk2\" (UniqueName: \"kubernetes.io/projected/6144ea19-b46e-449b-85a2-89be0d315561-kube-api-access-wvhk2\") pod \"6144ea19-b46e-449b-85a2-89be0d315561\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.056004 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-config\") pod \"6144ea19-b46e-449b-85a2-89be0d315561\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.056121 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-dns-svc\") pod \"6144ea19-b46e-449b-85a2-89be0d315561\" (UID: \"6144ea19-b46e-449b-85a2-89be0d315561\") " Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.061587 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6144ea19-b46e-449b-85a2-89be0d315561-kube-api-access-wvhk2" (OuterVolumeSpecName: "kube-api-access-wvhk2") pod "6144ea19-b46e-449b-85a2-89be0d315561" (UID: "6144ea19-b46e-449b-85a2-89be0d315561"). InnerVolumeSpecName "kube-api-access-wvhk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.067329 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-w9gjq"] Mar 18 13:24:48 crc kubenswrapper[4912]: E0318 13:24:48.068272 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6144ea19-b46e-449b-85a2-89be0d315561" containerName="dnsmasq-dns" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.068324 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6144ea19-b46e-449b-85a2-89be0d315561" containerName="dnsmasq-dns" Mar 18 13:24:48 crc kubenswrapper[4912]: E0318 13:24:48.068343 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6144ea19-b46e-449b-85a2-89be0d315561" containerName="init" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.068352 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6144ea19-b46e-449b-85a2-89be0d315561" containerName="init" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.068850 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6144ea19-b46e-449b-85a2-89be0d315561" containerName="dnsmasq-dns" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.070495 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.096294 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-34ea-account-create-update-jhclb"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.100458 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.105140 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.113221 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-w9gjq"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.144271 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-34ea-account-create-update-jhclb"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.175974 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6144ea19-b46e-449b-85a2-89be0d315561" (UID: "6144ea19-b46e-449b-85a2-89be0d315561"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.184612 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc016d4-4749-4456-92a5-b66653a5ef44-operator-scripts\") pod \"glance-db-create-w9gjq\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.187127 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6ed6e1a-405b-4599-8c25-abb534946198-operator-scripts\") pod \"glance-34ea-account-create-update-jhclb\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.187578 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mfvtz\" (UniqueName: \"kubernetes.io/projected/9fc016d4-4749-4456-92a5-b66653a5ef44-kube-api-access-mfvtz\") pod \"glance-db-create-w9gjq\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.187957 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7lhn\" (UniqueName: \"kubernetes.io/projected/e6ed6e1a-405b-4599-8c25-abb534946198-kube-api-access-r7lhn\") pod \"glance-34ea-account-create-update-jhclb\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.189746 4912 generic.go:334] "Generic (PLEG): container finished" podID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerID="5801ffc0cd0d73fbb0f23e8f6d53070d02b9ebf94757fef74cc501ba2a5593a6" exitCode=0 Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.190016 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.189206 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-config" (OuterVolumeSpecName: "config") pod "6144ea19-b46e-449b-85a2-89be0d315561" (UID: "6144ea19-b46e-449b-85a2-89be0d315561"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.190176 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z9qpx" event={"ID":"a89dc2f9-7cb6-41c1-93ec-6204790a2b44","Type":"ContainerDied","Data":"5801ffc0cd0d73fbb0f23e8f6d53070d02b9ebf94757fef74cc501ba2a5593a6"} Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.190412 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z9qpx" event={"ID":"a89dc2f9-7cb6-41c1-93ec-6204790a2b44","Type":"ContainerStarted","Data":"ea4af018898f14bd7b81ac11f6674ab11731bc45ac180fec7ca27f4a6441f4e8"} Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.207528 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" event={"ID":"6144ea19-b46e-449b-85a2-89be0d315561","Type":"ContainerDied","Data":"942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916"} Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.199938 4912 generic.go:334] "Generic (PLEG): container finished" podID="6144ea19-b46e-449b-85a2-89be0d315561" containerID="942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916" exitCode=0 Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.200068 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.208006 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-r7pms" event={"ID":"6144ea19-b46e-449b-85a2-89be0d315561","Type":"ContainerDied","Data":"6b7ba42edaae166542710e1744d4033bb27b5c0fbe4801d1cb2ff50b9e72e24b"} Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.207792 4912 scope.go:117] "RemoveContainer" containerID="942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.190637 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvhk2\" (UniqueName: \"kubernetes.io/projected/6144ea19-b46e-449b-85a2-89be0d315561-kube-api-access-wvhk2\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.223068 4912 generic.go:334] "Generic (PLEG): container finished" podID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerID="61cd9944abf703e9cd8146a733e46295eaa8dbab025cf3ccba1d6d52b50d2ed2" exitCode=0 Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.223217 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerDied","Data":"61cd9944abf703e9cd8146a733e46295eaa8dbab025cf3ccba1d6d52b50d2ed2"} Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.236474 4912 generic.go:334] "Generic (PLEG): container finished" podID="950ce283-d3ce-4334-9a58-78715bce4995" containerID="a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d" exitCode=0 Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.273897 4912 scope.go:117] "RemoveContainer" containerID="fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.282505 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="416535c8-1b0c-4a0a-a789-0223c011a4bb" path="/var/lib/kubelet/pods/416535c8-1b0c-4a0a-a789-0223c011a4bb/volumes" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.286339 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" event={"ID":"950ce283-d3ce-4334-9a58-78715bce4995","Type":"ContainerDied","Data":"a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d"} Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.286415 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qqfb7" event={"ID":"f2b4068a-eb2f-4744-afc6-353f9704e68f","Type":"ContainerStarted","Data":"e1893296e0f853afcc4ddd454bf38328082fea8bdbad86e7efae65a764584bb0"} Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.320485 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7lhn\" (UniqueName: \"kubernetes.io/projected/e6ed6e1a-405b-4599-8c25-abb534946198-kube-api-access-r7lhn\") pod \"glance-34ea-account-create-update-jhclb\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.321860 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc016d4-4749-4456-92a5-b66653a5ef44-operator-scripts\") pod \"glance-db-create-w9gjq\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.321999 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6ed6e1a-405b-4599-8c25-abb534946198-operator-scripts\") pod \"glance-34ea-account-create-update-jhclb\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc 
kubenswrapper[4912]: I0318 13:24:48.324229 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6ed6e1a-405b-4599-8c25-abb534946198-operator-scripts\") pod \"glance-34ea-account-create-update-jhclb\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.326795 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvtz\" (UniqueName: \"kubernetes.io/projected/9fc016d4-4749-4456-92a5-b66653a5ef44-kube-api-access-mfvtz\") pod \"glance-db-create-w9gjq\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.327090 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc016d4-4749-4456-92a5-b66653a5ef44-operator-scripts\") pod \"glance-db-create-w9gjq\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.328413 4912 scope.go:117] "RemoveContainer" containerID="942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.329339 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6144ea19-b46e-449b-85a2-89be0d315561-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:48 crc kubenswrapper[4912]: E0318 13:24:48.335914 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916\": container with ID starting with 942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916 not found: ID does not exist" 
containerID="942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.335970 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916"} err="failed to get container status \"942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916\": rpc error: code = NotFound desc = could not find container \"942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916\": container with ID starting with 942680032c2bc7646863a3d13f7d563e04096a1c08bddca192632f88c1b75916 not found: ID does not exist" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.336006 4912 scope.go:117] "RemoveContainer" containerID="fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c" Mar 18 13:24:48 crc kubenswrapper[4912]: E0318 13:24:48.336725 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c\": container with ID starting with fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c not found: ID does not exist" containerID="fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.336793 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c"} err="failed to get container status \"fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c\": rpc error: code = NotFound desc = could not find container \"fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c\": container with ID starting with fd9b22c3cb20cdc373295f43124fb8b1affeab418680cd7cd8811cb40559ca5c not found: ID does not exist" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.375023 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvtz\" (UniqueName: \"kubernetes.io/projected/9fc016d4-4749-4456-92a5-b66653a5ef44-kube-api-access-mfvtz\") pod \"glance-db-create-w9gjq\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.376413 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-qqfb7" podStartSLOduration=3.376390087 podStartE2EDuration="3.376390087s" podCreationTimestamp="2026-03-18 13:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:48.329461066 +0000 UTC m=+1336.788888491" watchObservedRunningTime="2026-03-18 13:24:48.376390087 +0000 UTC m=+1336.835817512" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.386026 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7lhn\" (UniqueName: \"kubernetes.io/projected/e6ed6e1a-405b-4599-8c25-abb534946198-kube-api-access-r7lhn\") pod \"glance-34ea-account-create-update-jhclb\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.398886 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-r7pms"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.418264 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-r7pms"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.441768 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-l67rx"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.443466 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.463680 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l67rx"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.505226 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.508291 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.516226 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.537871 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821a74e9-e276-47c9-8401-7de9010901bf-operator-scripts\") pod \"keystone-db-create-l67rx\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.537963 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ljk\" (UniqueName: \"kubernetes.io/projected/821a74e9-e276-47c9-8401-7de9010901bf-kube-api-access-k4ljk\") pod \"keystone-db-create-l67rx\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.596467 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.605773 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f5d9-account-create-update-wrz5v"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.615859 4912 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.623741 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.635626 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f5d9-account-create-update-wrz5v"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.643319 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4ljk\" (UniqueName: \"kubernetes.io/projected/821a74e9-e276-47c9-8401-7de9010901bf-kube-api-access-k4ljk\") pod \"keystone-db-create-l67rx\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.643525 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821a74e9-e276-47c9-8401-7de9010901bf-operator-scripts\") pod \"keystone-db-create-l67rx\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.666408 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821a74e9-e276-47c9-8401-7de9010901bf-operator-scripts\") pod \"keystone-db-create-l67rx\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.681556 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4ljk\" (UniqueName: \"kubernetes.io/projected/821a74e9-e276-47c9-8401-7de9010901bf-kube-api-access-k4ljk\") pod \"keystone-db-create-l67rx\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " 
pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.746865 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef449253-2c62-47bf-a6aa-513c1c28de28-operator-scripts\") pod \"keystone-f5d9-account-create-update-wrz5v\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.747029 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqcj\" (UniqueName: \"kubernetes.io/projected/ef449253-2c62-47bf-a6aa-513c1c28de28-kube-api-access-ctqcj\") pod \"keystone-f5d9-account-create-update-wrz5v\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.760018 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-d4t94"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.772828 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d4t94" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.786818 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.791365 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d4t94"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.825158 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8eb0-account-create-update-mls4v"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.828535 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.834221 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.843246 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8eb0-account-create-update-mls4v"] Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.853925 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef449253-2c62-47bf-a6aa-513c1c28de28-operator-scripts\") pod \"keystone-f5d9-account-create-update-wrz5v\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.854139 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqcj\" (UniqueName: \"kubernetes.io/projected/ef449253-2c62-47bf-a6aa-513c1c28de28-kube-api-access-ctqcj\") pod \"keystone-f5d9-account-create-update-wrz5v\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.854358 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25lhm\" (UniqueName: \"kubernetes.io/projected/f3a595f8-7a9b-4c81-ab92-71a0618740e3-kube-api-access-25lhm\") pod \"placement-db-create-d4t94\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " pod="openstack/placement-db-create-d4t94" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.854497 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3a595f8-7a9b-4c81-ab92-71a0618740e3-operator-scripts\") pod 
\"placement-db-create-d4t94\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " pod="openstack/placement-db-create-d4t94" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.855953 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef449253-2c62-47bf-a6aa-513c1c28de28-operator-scripts\") pod \"keystone-f5d9-account-create-update-wrz5v\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.902979 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqcj\" (UniqueName: \"kubernetes.io/projected/ef449253-2c62-47bf-a6aa-513c1c28de28-kube-api-access-ctqcj\") pod \"keystone-f5d9-account-create-update-wrz5v\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.960260 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szm7k\" (UniqueName: \"kubernetes.io/projected/8a03ef05-1ed6-4212-b1b6-c904231640f8-kube-api-access-szm7k\") pod \"placement-8eb0-account-create-update-mls4v\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.960463 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a03ef05-1ed6-4212-b1b6-c904231640f8-operator-scripts\") pod \"placement-8eb0-account-create-update-mls4v\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.960578 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-25lhm\" (UniqueName: \"kubernetes.io/projected/f3a595f8-7a9b-4c81-ab92-71a0618740e3-kube-api-access-25lhm\") pod \"placement-db-create-d4t94\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " pod="openstack/placement-db-create-d4t94" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.960663 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3a595f8-7a9b-4c81-ab92-71a0618740e3-operator-scripts\") pod \"placement-db-create-d4t94\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " pod="openstack/placement-db-create-d4t94" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.961690 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3a595f8-7a9b-4c81-ab92-71a0618740e3-operator-scripts\") pod \"placement-db-create-d4t94\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " pod="openstack/placement-db-create-d4t94" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.977164 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:48 crc kubenswrapper[4912]: I0318 13:24:48.991173 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25lhm\" (UniqueName: \"kubernetes.io/projected/f3a595f8-7a9b-4c81-ab92-71a0618740e3-kube-api-access-25lhm\") pod \"placement-db-create-d4t94\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " pod="openstack/placement-db-create-d4t94" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.065458 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szm7k\" (UniqueName: \"kubernetes.io/projected/8a03ef05-1ed6-4212-b1b6-c904231640f8-kube-api-access-szm7k\") pod \"placement-8eb0-account-create-update-mls4v\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.065548 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a03ef05-1ed6-4212-b1b6-c904231640f8-operator-scripts\") pod \"placement-8eb0-account-create-update-mls4v\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.071176 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a03ef05-1ed6-4212-b1b6-c904231640f8-operator-scripts\") pod \"placement-8eb0-account-create-update-mls4v\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.080567 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d4t94" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.088172 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szm7k\" (UniqueName: \"kubernetes.io/projected/8a03ef05-1ed6-4212-b1b6-c904231640f8-kube-api-access-szm7k\") pod \"placement-8eb0-account-create-update-mls4v\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.103521 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.298062 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" event={"ID":"950ce283-d3ce-4334-9a58-78715bce4995","Type":"ContainerStarted","Data":"1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2"} Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.298572 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.331659 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-w9gjq"] Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.336121 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z9qpx" event={"ID":"a89dc2f9-7cb6-41c1-93ec-6204790a2b44","Type":"ContainerStarted","Data":"009c5d2eb5fac2ed6b862f524b15bdf3bfffba4642973c8bf24872407bdcfd2e"} Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.338525 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.352934 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"85faf7f2-1b95-4210-88b1-cae393033960","Type":"ContainerStarted","Data":"76c054acc38b2e15bdb3052992a5bb2c8e138ec37848a68943879ba39891ddfc"} Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.360305 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" podStartSLOduration=4.36027082 podStartE2EDuration="4.36027082s" podCreationTimestamp="2026-03-18 13:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:49.327934351 +0000 UTC m=+1337.787361786" watchObservedRunningTime="2026-03-18 13:24:49.36027082 +0000 UTC m=+1337.819698245" Mar 18 13:24:49 crc kubenswrapper[4912]: W0318 13:24:49.371082 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fc016d4_4749_4456_92a5_b66653a5ef44.slice/crio-4f1bdd77c76393139e5bcacf861da1c52dfb898dde6567382bb597621a2281ad WatchSource:0}: Error finding container 4f1bdd77c76393139e5bcacf861da1c52dfb898dde6567382bb597621a2281ad: Status 404 returned error can't find the container with id 4f1bdd77c76393139e5bcacf861da1c52dfb898dde6567382bb597621a2281ad Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.428131 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-34ea-account-create-update-jhclb"] Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.444921 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-z9qpx" podStartSLOduration=3.444889285 podStartE2EDuration="3.444889285s" podCreationTimestamp="2026-03-18 13:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:49.387168863 +0000 UTC m=+1337.846596288" watchObservedRunningTime="2026-03-18 13:24:49.444889285 +0000 UTC 
m=+1337.904316710" Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.672298 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l67rx"] Mar 18 13:24:49 crc kubenswrapper[4912]: W0318 13:24:49.717232 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod821a74e9_e276_47c9_8401_7de9010901bf.slice/crio-b1dea17c23056924760e2fb940696dd8b185d0c20881f13b3678b2bf2ca4a56a WatchSource:0}: Error finding container b1dea17c23056924760e2fb940696dd8b185d0c20881f13b3678b2bf2ca4a56a: Status 404 returned error can't find the container with id b1dea17c23056924760e2fb940696dd8b185d0c20881f13b3678b2bf2ca4a56a Mar 18 13:24:49 crc kubenswrapper[4912]: I0318 13:24:49.755137 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f5d9-account-create-update-wrz5v"] Mar 18 13:24:49 crc kubenswrapper[4912]: W0318 13:24:49.782392 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef449253_2c62_47bf_a6aa_513c1c28de28.slice/crio-2d9967561a099cb2e1265e293110a7876b8295995b41f2dcf8db229b66b36d62 WatchSource:0}: Error finding container 2d9967561a099cb2e1265e293110a7876b8295995b41f2dcf8db229b66b36d62: Status 404 returned error can't find the container with id 2d9967561a099cb2e1265e293110a7876b8295995b41f2dcf8db229b66b36d62 Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.117892 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c8cdk"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.120526 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.147944 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d4t94"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.163004 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c8cdk"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.210695 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/218793dd-ce60-49c1-87b3-43176d51e230-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c8cdk\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.210805 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwjnc\" (UniqueName: \"kubernetes.io/projected/218793dd-ce60-49c1-87b3-43176d51e230-kube-api-access-fwjnc\") pod \"mysqld-exporter-openstack-db-create-c8cdk\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.315749 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/218793dd-ce60-49c1-87b3-43176d51e230-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c8cdk\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.315908 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwjnc\" (UniqueName: \"kubernetes.io/projected/218793dd-ce60-49c1-87b3-43176d51e230-kube-api-access-fwjnc\") 
pod \"mysqld-exporter-openstack-db-create-c8cdk\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.317698 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/218793dd-ce60-49c1-87b3-43176d51e230-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c8cdk\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.330007 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6144ea19-b46e-449b-85a2-89be0d315561" path="/var/lib/kubelet/pods/6144ea19-b46e-449b-85a2-89be0d315561/volumes" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.344018 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-n9bh9"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.344106 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-449s7"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.355389 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-449s7"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.355516 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.381334 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwjnc\" (UniqueName: \"kubernetes.io/projected/218793dd-ce60-49c1-87b3-43176d51e230-kube-api-access-fwjnc\") pod \"mysqld-exporter-openstack-db-create-c8cdk\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.414423 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.424005 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-e14e-account-create-update-jt7n6"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.437726 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.438023 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f5d9-account-create-update-wrz5v" event={"ID":"ef449253-2c62-47bf-a6aa-513c1c28de28","Type":"ContainerStarted","Data":"2d9967561a099cb2e1265e293110a7876b8295995b41f2dcf8db229b66b36d62"} Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.465343 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.477060 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w9gjq" event={"ID":"9fc016d4-4749-4456-92a5-b66653a5ef44","Type":"ContainerStarted","Data":"400cdf6a5f4d818bb783d661246690a10d698896cc7940a218a9380b1378cc87"} Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.477140 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-w9gjq" event={"ID":"9fc016d4-4749-4456-92a5-b66653a5ef44","Type":"ContainerStarted","Data":"4f1bdd77c76393139e5bcacf861da1c52dfb898dde6567382bb597621a2281ad"} Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.525746 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-34ea-account-create-update-jhclb" event={"ID":"e6ed6e1a-405b-4599-8c25-abb534946198","Type":"ContainerStarted","Data":"ac8e1b96612ce2bbc73571c2b84205073b8a8d919dfad7f790dc49c65dbde137"} Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.526270 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-34ea-account-create-update-jhclb" event={"ID":"e6ed6e1a-405b-4599-8c25-abb534946198","Type":"ContainerStarted","Data":"06eafac19b2e404b5c272d36fe9ee30de181ce449aeaacf0e636567cfdf8a7a1"} Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.532721 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d4t94" event={"ID":"f3a595f8-7a9b-4c81-ab92-71a0618740e3","Type":"ContainerStarted","Data":"7f64e2aa6d99a700732aadd18f97d07832befd8e9cd061ba1919b9987cc29a5f"} Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.560759 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e14e-account-create-update-jt7n6"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.561173 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l67rx" event={"ID":"821a74e9-e276-47c9-8401-7de9010901bf","Type":"ContainerStarted","Data":"b1dea17c23056924760e2fb940696dd8b185d0c20881f13b3678b2bf2ca4a56a"} Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.566431 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-config\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.566609 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-operator-scripts\") pod \"mysqld-exporter-e14e-account-create-update-jt7n6\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.566715 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.566767 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dvw\" (UniqueName: \"kubernetes.io/projected/ae016812-7be3-4001-822e-9979bd4ce648-kube-api-access-v4dvw\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.566800 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.566889 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2vrt\" (UniqueName: 
\"kubernetes.io/projected/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-kube-api-access-c2vrt\") pod \"mysqld-exporter-e14e-account-create-update-jt7n6\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.570456 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.622632 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.679802 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-config\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.679981 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-operator-scripts\") pod \"mysqld-exporter-e14e-account-create-update-jt7n6\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.680075 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.680105 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dvw\" (UniqueName: \"kubernetes.io/projected/ae016812-7be3-4001-822e-9979bd4ce648-kube-api-access-v4dvw\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.680128 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.680195 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2vrt\" (UniqueName: \"kubernetes.io/projected/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-kube-api-access-c2vrt\") pod \"mysqld-exporter-e14e-account-create-update-jt7n6\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.680238 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.684730 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-config\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.686290 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-w9gjq" podStartSLOduration=3.686268277 podStartE2EDuration="3.686268277s" podCreationTimestamp="2026-03-18 13:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:50.536356048 +0000 UTC m=+1338.995783493" watchObservedRunningTime="2026-03-18 13:24:50.686268277 +0000 UTC m=+1339.145695702" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.686396 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.687692 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-operator-scripts\") pod \"mysqld-exporter-e14e-account-create-update-jt7n6\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.687857 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.688779 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.710874 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2vrt\" (UniqueName: \"kubernetes.io/projected/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-kube-api-access-c2vrt\") pod \"mysqld-exporter-e14e-account-create-update-jt7n6\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.713008 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dvw\" (UniqueName: \"kubernetes.io/projected/ae016812-7be3-4001-822e-9979bd4ce648-kube-api-access-v4dvw\") pod \"dnsmasq-dns-b8fbc5445-449s7\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.720207 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8eb0-account-create-update-mls4v"] Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.742281 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-34ea-account-create-update-jhclb" podStartSLOduration=3.742248272 podStartE2EDuration="3.742248272s" podCreationTimestamp="2026-03-18 13:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:50.577582246 +0000 UTC m=+1339.037009671" watchObservedRunningTime="2026-03-18 13:24:50.742248272 +0000 UTC m=+1339.201675697" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.785335 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:50 crc kubenswrapper[4912]: I0318 13:24:50.796825 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.239004 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c8cdk"] Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.458219 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.472753 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.479715 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.480021 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-w7h22" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.480291 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.480473 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.496705 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.586963 4912 generic.go:334] "Generic (PLEG): container finished" podID="9fc016d4-4749-4456-92a5-b66653a5ef44" containerID="400cdf6a5f4d818bb783d661246690a10d698896cc7940a218a9380b1378cc87" exitCode=0 Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.587183 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w9gjq" 
event={"ID":"9fc016d4-4749-4456-92a5-b66653a5ef44","Type":"ContainerDied","Data":"400cdf6a5f4d818bb783d661246690a10d698896cc7940a218a9380b1378cc87"} Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.594259 4912 generic.go:334] "Generic (PLEG): container finished" podID="e6ed6e1a-405b-4599-8c25-abb534946198" containerID="ac8e1b96612ce2bbc73571c2b84205073b8a8d919dfad7f790dc49c65dbde137" exitCode=0 Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.594547 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-34ea-account-create-update-jhclb" event={"ID":"e6ed6e1a-405b-4599-8c25-abb534946198","Type":"ContainerDied","Data":"ac8e1b96612ce2bbc73571c2b84205073b8a8d919dfad7f790dc49c65dbde137"} Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.596617 4912 generic.go:334] "Generic (PLEG): container finished" podID="f3a595f8-7a9b-4c81-ab92-71a0618740e3" containerID="fa7b12ecc7c712600e34cec47150811b9f52ca500907a4dbe254b7e58775c327" exitCode=0 Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.596771 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d4t94" event={"ID":"f3a595f8-7a9b-4c81-ab92-71a0618740e3","Type":"ContainerDied","Data":"fa7b12ecc7c712600e34cec47150811b9f52ca500907a4dbe254b7e58775c327"} Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.601801 4912 generic.go:334] "Generic (PLEG): container finished" podID="821a74e9-e276-47c9-8401-7de9010901bf" containerID="dff4a79f7afa21763a6f27f98e7a28b52b90389b22866c2264a4fb27db600efd" exitCode=0 Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.601862 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l67rx" event={"ID":"821a74e9-e276-47c9-8401-7de9010901bf","Type":"ContainerDied","Data":"dff4a79f7afa21763a6f27f98e7a28b52b90389b22866c2264a4fb27db600efd"} Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.613766 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.613818 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw88p\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-kube-api-access-vw88p\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.613848 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8f71e79a-72ad-4de7-9b24-7ac75884deae-lock\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.613922 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.613944 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8f71e79a-72ad-4de7-9b24-7ac75884deae-cache\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.614018 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f71e79a-72ad-4de7-9b24-7ac75884deae-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.614853 4912 generic.go:334] "Generic (PLEG): container finished" podID="ef449253-2c62-47bf-a6aa-513c1c28de28" containerID="b51340747b1ea1d80e2f5c93d896ea9566f503f0a45e533a8553bb438e865f0a" exitCode=0 Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.614913 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f5d9-account-create-update-wrz5v" event={"ID":"ef449253-2c62-47bf-a6aa-513c1c28de28","Type":"ContainerDied","Data":"b51340747b1ea1d80e2f5c93d896ea9566f503f0a45e533a8553bb438e865f0a"} Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.617087 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" podUID="950ce283-d3ce-4334-9a58-78715bce4995" containerName="dnsmasq-dns" containerID="cri-o://1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2" gracePeriod=10 Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.617885 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8eb0-account-create-update-mls4v" event={"ID":"8a03ef05-1ed6-4212-b1b6-c904231640f8","Type":"ContainerStarted","Data":"3ce2b7e9a4d84c9e08cdcead47663224e500e4dcf6e29ab26f1b66bfe95be8b4"} Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.617914 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8eb0-account-create-update-mls4v" event={"ID":"8a03ef05-1ed6-4212-b1b6-c904231640f8","Type":"ContainerStarted","Data":"dc532b389170134e41f8a6fe4ebf5fc760a3fcfb9846932d8c552abee788021a"} Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.707671 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-8eb0-account-create-update-mls4v" podStartSLOduration=3.707642918 podStartE2EDuration="3.707642918s" podCreationTimestamp="2026-03-18 13:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:51.692524061 +0000 UTC m=+1340.151951486" watchObservedRunningTime="2026-03-18 13:24:51.707642918 +0000 UTC m=+1340.167070343" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.716725 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.716820 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8f71e79a-72ad-4de7-9b24-7ac75884deae-cache\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: E0318 13:24:51.717198 4912 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 13:24:51 crc kubenswrapper[4912]: E0318 13:24:51.717241 4912 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 13:24:51 crc kubenswrapper[4912]: E0318 13:24:51.717312 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift podName:8f71e79a-72ad-4de7-9b24-7ac75884deae nodeName:}" failed. No retries permitted until 2026-03-18 13:24:52.217291677 +0000 UTC m=+1340.676719102 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift") pod "swift-storage-0" (UID: "8f71e79a-72ad-4de7-9b24-7ac75884deae") : configmap "swift-ring-files" not found Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.717727 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8f71e79a-72ad-4de7-9b24-7ac75884deae-cache\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.718265 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f71e79a-72ad-4de7-9b24-7ac75884deae-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.718643 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.718726 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw88p\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-kube-api-access-vw88p\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.718803 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8f71e79a-72ad-4de7-9b24-7ac75884deae-lock\") pod \"swift-storage-0\" 
(UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.719774 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8f71e79a-72ad-4de7-9b24-7ac75884deae-lock\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.728358 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.728409 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f5bf38ea1e39a6ab72bed427a38ca4b4544a34026fbaeca8b8ab2a347d5a886/globalmount\"" pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.733430 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f71e79a-72ad-4de7-9b24-7ac75884deae-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.743774 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw88p\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-kube-api-access-vw88p\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:51 crc kubenswrapper[4912]: I0318 13:24:51.787938 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09a99cf9-c2db-4b48-a4a1-f98c26449662\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.244521 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:52 crc kubenswrapper[4912]: E0318 13:24:52.245756 4912 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 13:24:52 crc kubenswrapper[4912]: E0318 13:24:52.245780 4912 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 13:24:52 crc kubenswrapper[4912]: E0318 13:24:52.245844 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift podName:8f71e79a-72ad-4de7-9b24-7ac75884deae nodeName:}" failed. No retries permitted until 2026-03-18 13:24:53.245826612 +0000 UTC m=+1341.705254037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift") pod "swift-storage-0" (UID: "8f71e79a-72ad-4de7-9b24-7ac75884deae") : configmap "swift-ring-files" not found Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.320576 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.469262 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-dns-svc\") pod \"950ce283-d3ce-4334-9a58-78715bce4995\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.469522 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-config\") pod \"950ce283-d3ce-4334-9a58-78715bce4995\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.469686 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbrvt\" (UniqueName: \"kubernetes.io/projected/950ce283-d3ce-4334-9a58-78715bce4995-kube-api-access-gbrvt\") pod \"950ce283-d3ce-4334-9a58-78715bce4995\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.469768 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-ovsdbserver-sb\") pod \"950ce283-d3ce-4334-9a58-78715bce4995\" (UID: \"950ce283-d3ce-4334-9a58-78715bce4995\") " Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.477261 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/950ce283-d3ce-4334-9a58-78715bce4995-kube-api-access-gbrvt" (OuterVolumeSpecName: "kube-api-access-gbrvt") pod "950ce283-d3ce-4334-9a58-78715bce4995" (UID: "950ce283-d3ce-4334-9a58-78715bce4995"). InnerVolumeSpecName "kube-api-access-gbrvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.477362 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e14e-account-create-update-jt7n6"] Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.491282 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-449s7"] Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.530755 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "950ce283-d3ce-4334-9a58-78715bce4995" (UID: "950ce283-d3ce-4334-9a58-78715bce4995"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.572343 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbrvt\" (UniqueName: \"kubernetes.io/projected/950ce283-d3ce-4334-9a58-78715bce4995-kube-api-access-gbrvt\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.572377 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.572944 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-config" (OuterVolumeSpecName: "config") pod "950ce283-d3ce-4334-9a58-78715bce4995" (UID: "950ce283-d3ce-4334-9a58-78715bce4995"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.579729 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "950ce283-d3ce-4334-9a58-78715bce4995" (UID: "950ce283-d3ce-4334-9a58-78715bce4995"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.634083 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"85faf7f2-1b95-4210-88b1-cae393033960","Type":"ContainerStarted","Data":"6f95adfdba81755e528f35dc6c7b384201591dd526a8c4ceaa9ac33f63c8962b"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.634263 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.635495 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" event={"ID":"ae016812-7be3-4001-822e-9979bd4ce648","Type":"ContainerStarted","Data":"f1412cdb532d7d81c4a674ac082fe9bfc07ec9c48e8f0df40cbe6f12704603e0"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.640580 4912 generic.go:334] "Generic (PLEG): container finished" podID="950ce283-d3ce-4334-9a58-78715bce4995" containerID="1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2" exitCode=0 Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.640665 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" event={"ID":"950ce283-d3ce-4334-9a58-78715bce4995","Type":"ContainerDied","Data":"1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.640704 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" 
event={"ID":"950ce283-d3ce-4334-9a58-78715bce4995","Type":"ContainerDied","Data":"f48ad5ff6f7fcf5ed563eca491924c359b58b413da871664813e30580d580f3d"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.640727 4912 scope.go:117] "RemoveContainer" containerID="1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.640925 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-n9bh9" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.661968 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" event={"ID":"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec","Type":"ContainerStarted","Data":"e0770c208c3556f1b1bdda9a91b3fde588d9ae40816498db07a5fe39e9c530b1"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.666852 4912 generic.go:334] "Generic (PLEG): container finished" podID="218793dd-ce60-49c1-87b3-43176d51e230" containerID="3021e363fa6e06db15905c89f5fad26489c0f3a64358ea0805af2b0d25105304" exitCode=0 Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.666980 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" event={"ID":"218793dd-ce60-49c1-87b3-43176d51e230","Type":"ContainerDied","Data":"3021e363fa6e06db15905c89f5fad26489c0f3a64358ea0805af2b0d25105304"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.667020 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" event={"ID":"218793dd-ce60-49c1-87b3-43176d51e230","Type":"ContainerStarted","Data":"fdfd25405f846bba75bdcb2951de522a91c95ce4f966853b026ce7438017bd18"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.672330 4912 generic.go:334] "Generic (PLEG): container finished" podID="8a03ef05-1ed6-4212-b1b6-c904231640f8" 
containerID="3ce2b7e9a4d84c9e08cdcead47663224e500e4dcf6e29ab26f1b66bfe95be8b4" exitCode=0 Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.672637 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8eb0-account-create-update-mls4v" event={"ID":"8a03ef05-1ed6-4212-b1b6-c904231640f8","Type":"ContainerDied","Data":"3ce2b7e9a4d84c9e08cdcead47663224e500e4dcf6e29ab26f1b66bfe95be8b4"} Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.674662 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.674690 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/950ce283-d3ce-4334-9a58-78715bce4995-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.669021 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.37329077 podStartE2EDuration="5.668995515s" podCreationTimestamp="2026-03-18 13:24:47 +0000 UTC" firstStartedPulling="2026-03-18 13:24:48.625865092 +0000 UTC m=+1337.085292517" lastFinishedPulling="2026-03-18 13:24:51.921569837 +0000 UTC m=+1340.380997262" observedRunningTime="2026-03-18 13:24:52.654572127 +0000 UTC m=+1341.113999582" watchObservedRunningTime="2026-03-18 13:24:52.668995515 +0000 UTC m=+1341.128422940" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.700932 4912 scope.go:117] "RemoveContainer" containerID="a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.756678 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-n9bh9"] Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.764795 4912 scope.go:117] "RemoveContainer" 
containerID="1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2" Mar 18 13:24:52 crc kubenswrapper[4912]: E0318 13:24:52.765382 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2\": container with ID starting with 1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2 not found: ID does not exist" containerID="1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.765422 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2"} err="failed to get container status \"1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2\": rpc error: code = NotFound desc = could not find container \"1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2\": container with ID starting with 1a96ecba5ea0ef2912d71d6e329e8bc0272e49d052dcd20ad372a1e3dc2d8df2 not found: ID does not exist" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.765449 4912 scope.go:117] "RemoveContainer" containerID="a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d" Mar 18 13:24:52 crc kubenswrapper[4912]: E0318 13:24:52.765903 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d\": container with ID starting with a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d not found: ID does not exist" containerID="a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.765933 4912 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d"} err="failed to get container status \"a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d\": rpc error: code = NotFound desc = could not find container \"a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d\": container with ID starting with a10df46375d0172ba9d2389b1d8793b0daa4e2aa19cc75080833388490ebd55d not found: ID does not exist" Mar 18 13:24:52 crc kubenswrapper[4912]: I0318 13:24:52.768088 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-n9bh9"] Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.207627 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.290699 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4ljk\" (UniqueName: \"kubernetes.io/projected/821a74e9-e276-47c9-8401-7de9010901bf-kube-api-access-k4ljk\") pod \"821a74e9-e276-47c9-8401-7de9010901bf\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.290879 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821a74e9-e276-47c9-8401-7de9010901bf-operator-scripts\") pod \"821a74e9-e276-47c9-8401-7de9010901bf\" (UID: \"821a74e9-e276-47c9-8401-7de9010901bf\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.291646 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:53 crc kubenswrapper[4912]: E0318 13:24:53.292070 4912 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 13:24:53 crc kubenswrapper[4912]: E0318 13:24:53.292120 4912 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 13:24:53 crc kubenswrapper[4912]: E0318 13:24:53.292202 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift podName:8f71e79a-72ad-4de7-9b24-7ac75884deae nodeName:}" failed. No retries permitted until 2026-03-18 13:24:55.292176684 +0000 UTC m=+1343.751604109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift") pod "swift-storage-0" (UID: "8f71e79a-72ad-4de7-9b24-7ac75884deae") : configmap "swift-ring-files" not found Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.293560 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/821a74e9-e276-47c9-8401-7de9010901bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "821a74e9-e276-47c9-8401-7de9010901bf" (UID: "821a74e9-e276-47c9-8401-7de9010901bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.308153 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821a74e9-e276-47c9-8401-7de9010901bf-kube-api-access-k4ljk" (OuterVolumeSpecName: "kube-api-access-k4ljk") pod "821a74e9-e276-47c9-8401-7de9010901bf" (UID: "821a74e9-e276-47c9-8401-7de9010901bf"). InnerVolumeSpecName "kube-api-access-k4ljk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.396944 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4ljk\" (UniqueName: \"kubernetes.io/projected/821a74e9-e276-47c9-8401-7de9010901bf-kube-api-access-k4ljk\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.396994 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821a74e9-e276-47c9-8401-7de9010901bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.693329 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f5d9-account-create-update-wrz5v" event={"ID":"ef449253-2c62-47bf-a6aa-513c1c28de28","Type":"ContainerDied","Data":"2d9967561a099cb2e1265e293110a7876b8295995b41f2dcf8db229b66b36d62"} Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.693431 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d9967561a099cb2e1265e293110a7876b8295995b41f2dcf8db229b66b36d62" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.696671 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w9gjq" event={"ID":"9fc016d4-4749-4456-92a5-b66653a5ef44","Type":"ContainerDied","Data":"4f1bdd77c76393139e5bcacf861da1c52dfb898dde6567382bb597621a2281ad"} Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.696701 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f1bdd77c76393139e5bcacf861da1c52dfb898dde6567382bb597621a2281ad" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.700519 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"85faf7f2-1b95-4210-88b1-cae393033960","Type":"ContainerStarted","Data":"818829698d16bd0704fe53da83446409ba7d7f2c68608daa5dacc2038c7de4d3"} Mar 18 13:24:53 crc 
kubenswrapper[4912]: I0318 13:24:53.702134 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.703350 4912 generic.go:334] "Generic (PLEG): container finished" podID="39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec" containerID="91206e7c251655068ce028fb1d382d87d4d66cd7c4e9d12c0e586bf6aff6ba57" exitCode=0 Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.703419 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" event={"ID":"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec","Type":"ContainerDied","Data":"91206e7c251655068ce028fb1d382d87d4d66cd7c4e9d12c0e586bf6aff6ba57"} Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.715985 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.716019 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d4t94" event={"ID":"f3a595f8-7a9b-4c81-ab92-71a0618740e3","Type":"ContainerDied","Data":"7f64e2aa6d99a700732aadd18f97d07832befd8e9cd061ba1919b9987cc29a5f"} Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.716114 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f64e2aa6d99a700732aadd18f97d07832befd8e9cd061ba1919b9987cc29a5f" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.728888 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d4t94" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.729899 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l67rx" event={"ID":"821a74e9-e276-47c9-8401-7de9010901bf","Type":"ContainerDied","Data":"b1dea17c23056924760e2fb940696dd8b185d0c20881f13b3678b2bf2ca4a56a"} Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.729937 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1dea17c23056924760e2fb940696dd8b185d0c20881f13b3678b2bf2ca4a56a" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.729985 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l67rx" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.744521 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.745888 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-34ea-account-create-update-jhclb" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.745929 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-34ea-account-create-update-jhclb" event={"ID":"e6ed6e1a-405b-4599-8c25-abb534946198","Type":"ContainerDied","Data":"06eafac19b2e404b5c272d36fe9ee30de181ce449aeaacf0e636567cfdf8a7a1"} Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.745972 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06eafac19b2e404b5c272d36fe9ee30de181ce449aeaacf0e636567cfdf8a7a1" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.763927 4912 generic.go:334] "Generic (PLEG): container finished" podID="ae016812-7be3-4001-822e-9979bd4ce648" containerID="911443aed4c6547a6b79646429ce2f624e9ba8e926b2288c0dda2cfeed9cfe5c" exitCode=0 Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.764072 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" event={"ID":"ae016812-7be3-4001-822e-9979bd4ce648","Type":"ContainerDied","Data":"911443aed4c6547a6b79646429ce2f624e9ba8e926b2288c0dda2cfeed9cfe5c"} Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805018 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfvtz\" (UniqueName: \"kubernetes.io/projected/9fc016d4-4749-4456-92a5-b66653a5ef44-kube-api-access-mfvtz\") pod \"9fc016d4-4749-4456-92a5-b66653a5ef44\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805160 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6ed6e1a-405b-4599-8c25-abb534946198-operator-scripts\") pod \"e6ed6e1a-405b-4599-8c25-abb534946198\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805269 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctqcj\" (UniqueName: \"kubernetes.io/projected/ef449253-2c62-47bf-a6aa-513c1c28de28-kube-api-access-ctqcj\") pod \"ef449253-2c62-47bf-a6aa-513c1c28de28\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805455 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3a595f8-7a9b-4c81-ab92-71a0618740e3-operator-scripts\") pod \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805573 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25lhm\" (UniqueName: \"kubernetes.io/projected/f3a595f8-7a9b-4c81-ab92-71a0618740e3-kube-api-access-25lhm\") pod \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\" (UID: \"f3a595f8-7a9b-4c81-ab92-71a0618740e3\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805653 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc016d4-4749-4456-92a5-b66653a5ef44-operator-scripts\") pod \"9fc016d4-4749-4456-92a5-b66653a5ef44\" (UID: \"9fc016d4-4749-4456-92a5-b66653a5ef44\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805684 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7lhn\" (UniqueName: \"kubernetes.io/projected/e6ed6e1a-405b-4599-8c25-abb534946198-kube-api-access-r7lhn\") pod \"e6ed6e1a-405b-4599-8c25-abb534946198\" (UID: \"e6ed6e1a-405b-4599-8c25-abb534946198\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.805765 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ef449253-2c62-47bf-a6aa-513c1c28de28-operator-scripts\") pod \"ef449253-2c62-47bf-a6aa-513c1c28de28\" (UID: \"ef449253-2c62-47bf-a6aa-513c1c28de28\") " Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.806721 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ed6e1a-405b-4599-8c25-abb534946198-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6ed6e1a-405b-4599-8c25-abb534946198" (UID: "e6ed6e1a-405b-4599-8c25-abb534946198"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.807291 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fc016d4-4749-4456-92a5-b66653a5ef44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9fc016d4-4749-4456-92a5-b66653a5ef44" (UID: "9fc016d4-4749-4456-92a5-b66653a5ef44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.807689 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a595f8-7a9b-4c81-ab92-71a0618740e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3a595f8-7a9b-4c81-ab92-71a0618740e3" (UID: "f3a595f8-7a9b-4c81-ab92-71a0618740e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.808625 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef449253-2c62-47bf-a6aa-513c1c28de28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef449253-2c62-47bf-a6aa-513c1c28de28" (UID: "ef449253-2c62-47bf-a6aa-513c1c28de28"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.812368 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a595f8-7a9b-4c81-ab92-71a0618740e3-kube-api-access-25lhm" (OuterVolumeSpecName: "kube-api-access-25lhm") pod "f3a595f8-7a9b-4c81-ab92-71a0618740e3" (UID: "f3a595f8-7a9b-4c81-ab92-71a0618740e3"). InnerVolumeSpecName "kube-api-access-25lhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.813826 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc016d4-4749-4456-92a5-b66653a5ef44-kube-api-access-mfvtz" (OuterVolumeSpecName: "kube-api-access-mfvtz") pod "9fc016d4-4749-4456-92a5-b66653a5ef44" (UID: "9fc016d4-4749-4456-92a5-b66653a5ef44"). InnerVolumeSpecName "kube-api-access-mfvtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.815707 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef449253-2c62-47bf-a6aa-513c1c28de28-kube-api-access-ctqcj" (OuterVolumeSpecName: "kube-api-access-ctqcj") pod "ef449253-2c62-47bf-a6aa-513c1c28de28" (UID: "ef449253-2c62-47bf-a6aa-513c1c28de28"). InnerVolumeSpecName "kube-api-access-ctqcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.818627 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ed6e1a-405b-4599-8c25-abb534946198-kube-api-access-r7lhn" (OuterVolumeSpecName: "kube-api-access-r7lhn") pod "e6ed6e1a-405b-4599-8c25-abb534946198" (UID: "e6ed6e1a-405b-4599-8c25-abb534946198"). InnerVolumeSpecName "kube-api-access-r7lhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913031 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25lhm\" (UniqueName: \"kubernetes.io/projected/f3a595f8-7a9b-4c81-ab92-71a0618740e3-kube-api-access-25lhm\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913114 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc016d4-4749-4456-92a5-b66653a5ef44-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913127 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7lhn\" (UniqueName: \"kubernetes.io/projected/e6ed6e1a-405b-4599-8c25-abb534946198-kube-api-access-r7lhn\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913142 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef449253-2c62-47bf-a6aa-513c1c28de28-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913154 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfvtz\" (UniqueName: \"kubernetes.io/projected/9fc016d4-4749-4456-92a5-b66653a5ef44-kube-api-access-mfvtz\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913168 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6ed6e1a-405b-4599-8c25-abb534946198-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913182 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctqcj\" (UniqueName: \"kubernetes.io/projected/ef449253-2c62-47bf-a6aa-513c1c28de28-kube-api-access-ctqcj\") on node \"crc\" DevicePath \"\"" 
Mar 18 13:24:53 crc kubenswrapper[4912]: I0318 13:24:53.913194 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3a595f8-7a9b-4c81-ab92-71a0618740e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.307201 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="950ce283-d3ce-4334-9a58-78715bce4995" path="/var/lib/kubelet/pods/950ce283-d3ce-4334-9a58-78715bce4995/volumes" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.378573 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.443167 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwjnc\" (UniqueName: \"kubernetes.io/projected/218793dd-ce60-49c1-87b3-43176d51e230-kube-api-access-fwjnc\") pod \"218793dd-ce60-49c1-87b3-43176d51e230\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.443346 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/218793dd-ce60-49c1-87b3-43176d51e230-operator-scripts\") pod \"218793dd-ce60-49c1-87b3-43176d51e230\" (UID: \"218793dd-ce60-49c1-87b3-43176d51e230\") " Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.448493 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/218793dd-ce60-49c1-87b3-43176d51e230-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "218793dd-ce60-49c1-87b3-43176d51e230" (UID: "218793dd-ce60-49c1-87b3-43176d51e230"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.453823 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218793dd-ce60-49c1-87b3-43176d51e230-kube-api-access-fwjnc" (OuterVolumeSpecName: "kube-api-access-fwjnc") pod "218793dd-ce60-49c1-87b3-43176d51e230" (UID: "218793dd-ce60-49c1-87b3-43176d51e230"). InnerVolumeSpecName "kube-api-access-fwjnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.455055 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.455855 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kt5tk"] Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.456556 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ed6e1a-405b-4599-8c25-abb534946198" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.456583 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ed6e1a-405b-4599-8c25-abb534946198" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457315 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821a74e9-e276-47c9-8401-7de9010901bf" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457338 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="821a74e9-e276-47c9-8401-7de9010901bf" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457349 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950ce283-d3ce-4334-9a58-78715bce4995" containerName="dnsmasq-dns" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457356 4912 
state_mem.go:107] "Deleted CPUSet assignment" podUID="950ce283-d3ce-4334-9a58-78715bce4995" containerName="dnsmasq-dns" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457371 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218793dd-ce60-49c1-87b3-43176d51e230" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457379 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="218793dd-ce60-49c1-87b3-43176d51e230" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457400 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc016d4-4749-4456-92a5-b66653a5ef44" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457426 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc016d4-4749-4456-92a5-b66653a5ef44" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457458 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a595f8-7a9b-4c81-ab92-71a0618740e3" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457467 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a595f8-7a9b-4c81-ab92-71a0618740e3" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457484 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950ce283-d3ce-4334-9a58-78715bce4995" containerName="init" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457493 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="950ce283-d3ce-4334-9a58-78715bce4995" containerName="init" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457513 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a03ef05-1ed6-4212-b1b6-c904231640f8" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 
13:24:54.457522 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a03ef05-1ed6-4212-b1b6-c904231640f8" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: E0318 13:24:54.457542 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef449253-2c62-47bf-a6aa-513c1c28de28" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457551 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef449253-2c62-47bf-a6aa-513c1c28de28" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457858 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef449253-2c62-47bf-a6aa-513c1c28de28" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457879 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="950ce283-d3ce-4334-9a58-78715bce4995" containerName="dnsmasq-dns" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457895 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ed6e1a-405b-4599-8c25-abb534946198" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457907 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="821a74e9-e276-47c9-8401-7de9010901bf" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457915 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a595f8-7a9b-4c81-ab92-71a0618740e3" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457931 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="218793dd-ce60-49c1-87b3-43176d51e230" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457943 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8a03ef05-1ed6-4212-b1b6-c904231640f8" containerName="mariadb-account-create-update" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.457959 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc016d4-4749-4456-92a5-b66653a5ef44" containerName="mariadb-database-create" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.459015 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.462597 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.474853 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kt5tk"] Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.545393 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a03ef05-1ed6-4212-b1b6-c904231640f8-operator-scripts\") pod \"8a03ef05-1ed6-4212-b1b6-c904231640f8\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.546138 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a03ef05-1ed6-4212-b1b6-c904231640f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a03ef05-1ed6-4212-b1b6-c904231640f8" (UID: "8a03ef05-1ed6-4212-b1b6-c904231640f8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.546151 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szm7k\" (UniqueName: \"kubernetes.io/projected/8a03ef05-1ed6-4212-b1b6-c904231640f8-kube-api-access-szm7k\") pod \"8a03ef05-1ed6-4212-b1b6-c904231640f8\" (UID: \"8a03ef05-1ed6-4212-b1b6-c904231640f8\") " Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.547328 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8tr\" (UniqueName: \"kubernetes.io/projected/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-kube-api-access-xl8tr\") pod \"root-account-create-update-kt5tk\" (UID: \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.547383 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-operator-scripts\") pod \"root-account-create-update-kt5tk\" (UID: \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.547470 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwjnc\" (UniqueName: \"kubernetes.io/projected/218793dd-ce60-49c1-87b3-43176d51e230-kube-api-access-fwjnc\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.547483 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a03ef05-1ed6-4212-b1b6-c904231640f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.547492 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/218793dd-ce60-49c1-87b3-43176d51e230-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.552112 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a03ef05-1ed6-4212-b1b6-c904231640f8-kube-api-access-szm7k" (OuterVolumeSpecName: "kube-api-access-szm7k") pod "8a03ef05-1ed6-4212-b1b6-c904231640f8" (UID: "8a03ef05-1ed6-4212-b1b6-c904231640f8"). InnerVolumeSpecName "kube-api-access-szm7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.650733 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8tr\" (UniqueName: \"kubernetes.io/projected/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-kube-api-access-xl8tr\") pod \"root-account-create-update-kt5tk\" (UID: \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.650827 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-operator-scripts\") pod \"root-account-create-update-kt5tk\" (UID: \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.651085 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szm7k\" (UniqueName: \"kubernetes.io/projected/8a03ef05-1ed6-4212-b1b6-c904231640f8-kube-api-access-szm7k\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.652162 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-operator-scripts\") pod \"root-account-create-update-kt5tk\" (UID: 
\"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.671189 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8tr\" (UniqueName: \"kubernetes.io/projected/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-kube-api-access-xl8tr\") pod \"root-account-create-update-kt5tk\" (UID: \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.783761 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kt5tk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.792572 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" event={"ID":"ae016812-7be3-4001-822e-9979bd4ce648","Type":"ContainerStarted","Data":"c310ab7b8dc3896908da85cfc27b03735933f37091211f10c8fb3db17cfe4826"} Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.792668 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.803495 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" event={"ID":"218793dd-ce60-49c1-87b3-43176d51e230","Type":"ContainerDied","Data":"fdfd25405f846bba75bdcb2951de522a91c95ce4f966853b026ce7438017bd18"} Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.803569 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdfd25405f846bba75bdcb2951de522a91c95ce4f966853b026ce7438017bd18" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.803588 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c8cdk" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.839542 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-w9gjq" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.839672 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8eb0-account-create-update-mls4v" event={"ID":"8a03ef05-1ed6-4212-b1b6-c904231640f8","Type":"ContainerDied","Data":"dc532b389170134e41f8a6fe4ebf5fc760a3fcfb9846932d8c552abee788021a"} Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.839752 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc532b389170134e41f8a6fe4ebf5fc760a3fcfb9846932d8c552abee788021a" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.839946 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8eb0-account-create-update-mls4v" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.840011 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" podStartSLOduration=4.839987642 podStartE2EDuration="4.839987642s" podCreationTimestamp="2026-03-18 13:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:54.81981515 +0000 UTC m=+1343.279242585" watchObservedRunningTime="2026-03-18 13:24:54.839987642 +0000 UTC m=+1343.299415067" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.840103 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d4t94" Mar 18 13:24:54 crc kubenswrapper[4912]: I0318 13:24:54.852138 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f5d9-account-create-update-wrz5v" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.374822 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:55 crc kubenswrapper[4912]: E0318 13:24:55.375424 4912 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 13:24:55 crc kubenswrapper[4912]: E0318 13:24:55.375439 4912 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 13:24:55 crc kubenswrapper[4912]: E0318 13:24:55.375497 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift podName:8f71e79a-72ad-4de7-9b24-7ac75884deae nodeName:}" failed. No retries permitted until 2026-03-18 13:24:59.375477194 +0000 UTC m=+1347.834904619 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift") pod "swift-storage-0" (UID: "8f71e79a-72ad-4de7-9b24-7ac75884deae") : configmap "swift-ring-files" not found Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.419125 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kt5tk"] Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.426586 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-cczr8"] Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.428357 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.440454 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.440893 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.440969 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.484181 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cczr8"] Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.580940 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-scripts\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.580990 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4cpn\" (UniqueName: \"kubernetes.io/projected/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-kube-api-access-d4cpn\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.581062 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-swiftconf\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.581093 
4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-dispersionconf\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.581119 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-combined-ca-bundle\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.581147 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-ring-data-devices\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.581225 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-etc-swift\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.584602 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.683283 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-operator-scripts\") pod \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.683636 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2vrt\" (UniqueName: \"kubernetes.io/projected/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-kube-api-access-c2vrt\") pod \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\" (UID: \"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec\") " Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.683998 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-dispersionconf\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.684057 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-combined-ca-bundle\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.684089 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-ring-data-devices\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc 
kubenswrapper[4912]: I0318 13:24:55.684184 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-etc-swift\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.684333 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-scripts\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.684365 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4cpn\" (UniqueName: \"kubernetes.io/projected/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-kube-api-access-d4cpn\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.684404 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-swiftconf\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.686849 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec" (UID: "39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.686959 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-etc-swift\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.687556 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-ring-data-devices\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.688120 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-scripts\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.696405 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-combined-ca-bundle\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.696932 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-swiftconf\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.697216 4912 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-kube-api-access-c2vrt" (OuterVolumeSpecName: "kube-api-access-c2vrt") pod "39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec" (UID: "39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec"). InnerVolumeSpecName "kube-api-access-c2vrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.698095 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-dispersionconf\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.711940 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4cpn\" (UniqueName: \"kubernetes.io/projected/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-kube-api-access-d4cpn\") pod \"swift-ring-rebalance-cczr8\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.787404 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2vrt\" (UniqueName: \"kubernetes.io/projected/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-kube-api-access-c2vrt\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.787862 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.846839 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.861006 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kt5tk" event={"ID":"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3","Type":"ContainerStarted","Data":"3fd550dbc93f008f2754403ab34ba07cead51e32b8c1e5d7aa45fa45938b6168"} Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.867837 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" event={"ID":"39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec","Type":"ContainerDied","Data":"e0770c208c3556f1b1bdda9a91b3fde588d9ae40816498db07a5fe39e9c530b1"} Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.867896 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0770c208c3556f1b1bdda9a91b3fde588d9ae40816498db07a5fe39e9c530b1" Mar 18 13:24:55 crc kubenswrapper[4912]: I0318 13:24:55.867901 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e14e-account-create-update-jt7n6" Mar 18 13:24:56 crc kubenswrapper[4912]: W0318 13:24:56.414300 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b02fe29_bc51_4fc4_86e7_44fb75e20e2b.slice/crio-5db5e45b0c6726adf4756c3c72ec8866e8e896e11bc7ce59783cc73822374198 WatchSource:0}: Error finding container 5db5e45b0c6726adf4756c3c72ec8866e8e896e11bc7ce59783cc73822374198: Status 404 returned error can't find the container with id 5db5e45b0c6726adf4756c3c72ec8866e8e896e11bc7ce59783cc73822374198 Mar 18 13:24:56 crc kubenswrapper[4912]: I0318 13:24:56.420636 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cczr8"] Mar 18 13:24:56 crc kubenswrapper[4912]: I0318 13:24:56.580163 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:24:56 crc kubenswrapper[4912]: I0318 13:24:56.885008 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cczr8" event={"ID":"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b","Type":"ContainerStarted","Data":"5db5e45b0c6726adf4756c3c72ec8866e8e896e11bc7ce59783cc73822374198"} Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.244755 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-krlbd"] Mar 18 13:24:58 crc kubenswrapper[4912]: E0318 13:24:58.247825 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec" containerName="mariadb-account-create-update" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.247855 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec" containerName="mariadb-account-create-update" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.248148 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec" containerName="mariadb-account-create-update" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.249192 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.253717 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lm6d9" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.253998 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.260532 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-krlbd"] Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.373768 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-config-data\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.373837 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-db-sync-config-data\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.373961 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnkw\" (UniqueName: \"kubernetes.io/projected/a92c61dc-cfdf-4610-81b7-553c9882fc26-kube-api-access-psnkw\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 
13:24:58.374033 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-combined-ca-bundle\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.475837 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-combined-ca-bundle\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.476023 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-config-data\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.476079 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-db-sync-config-data\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.476127 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psnkw\" (UniqueName: \"kubernetes.io/projected/a92c61dc-cfdf-4610-81b7-553c9882fc26-kube-api-access-psnkw\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.483631 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-combined-ca-bundle\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.483656 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-db-sync-config-data\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.484220 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-config-data\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.495316 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnkw\" (UniqueName: \"kubernetes.io/projected/a92c61dc-cfdf-4610-81b7-553c9882fc26-kube-api-access-psnkw\") pod \"glance-db-sync-krlbd\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:58 crc kubenswrapper[4912]: I0318 13:24:58.577226 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-krlbd" Mar 18 13:24:59 crc kubenswrapper[4912]: I0318 13:24:59.399979 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:24:59 crc kubenswrapper[4912]: E0318 13:24:59.400220 4912 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 13:24:59 crc kubenswrapper[4912]: E0318 13:24:59.400956 4912 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 13:24:59 crc kubenswrapper[4912]: E0318 13:24:59.401103 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift podName:8f71e79a-72ad-4de7-9b24-7ac75884deae nodeName:}" failed. No retries permitted until 2026-03-18 13:25:07.401068705 +0000 UTC m=+1355.860496130 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift") pod "swift-storage-0" (UID: "8f71e79a-72ad-4de7-9b24-7ac75884deae") : configmap "swift-ring-files" not found Mar 18 13:24:59 crc kubenswrapper[4912]: I0318 13:24:59.542852 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-krlbd"] Mar 18 13:24:59 crc kubenswrapper[4912]: I0318 13:24:59.918150 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kt5tk" event={"ID":"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3","Type":"ContainerStarted","Data":"bc88e5aa51f6871a39dba30b400b276e46d991120d0f21d6375bae33fdf14431"} Mar 18 13:24:59 crc kubenswrapper[4912]: I0318 13:24:59.920380 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-krlbd" event={"ID":"a92c61dc-cfdf-4610-81b7-553c9882fc26","Type":"ContainerStarted","Data":"2bbe7984a5c43471ab5affb70afed900f611123c519ef430b2bc565ab20b6a99"} Mar 18 13:24:59 crc kubenswrapper[4912]: I0318 13:24:59.945821 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-kt5tk" podStartSLOduration=5.945797125 podStartE2EDuration="5.945797125s" podCreationTimestamp="2026-03-18 13:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:24:59.940460302 +0000 UTC m=+1348.399887727" watchObservedRunningTime="2026-03-18 13:24:59.945797125 +0000 UTC m=+1348.405224550" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.731759 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-cvs82"] Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.733974 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.743091 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-cvs82"] Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.789334 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.849409 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f5291e-16e8-4832-925e-05e2e5406607-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-cvs82\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.849573 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2br\" (UniqueName: \"kubernetes.io/projected/01f5291e-16e8-4832-925e-05e2e5406607-kube-api-access-gt2br\") pod \"mysqld-exporter-openstack-cell1-db-create-cvs82\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.861171 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z9qpx"] Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.861458 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-z9qpx" podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerName="dnsmasq-dns" containerID="cri-o://009c5d2eb5fac2ed6b862f524b15bdf3bfffba4642973c8bf24872407bdcfd2e" gracePeriod=10 Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.949475 4912 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/mysqld-exporter-28d7-account-create-update-f7776"] Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.950393 4912 generic.go:334] "Generic (PLEG): container finished" podID="8ef2a27a-656e-400d-9bdb-9f7c26fa08e3" containerID="bc88e5aa51f6871a39dba30b400b276e46d991120d0f21d6375bae33fdf14431" exitCode=0 Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.951386 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kt5tk" event={"ID":"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3","Type":"ContainerDied","Data":"bc88e5aa51f6871a39dba30b400b276e46d991120d0f21d6375bae33fdf14431"} Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.951514 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.954496 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f5291e-16e8-4832-925e-05e2e5406607-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-cvs82\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.954706 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2br\" (UniqueName: \"kubernetes.io/projected/01f5291e-16e8-4832-925e-05e2e5406607-kube-api-access-gt2br\") pod \"mysqld-exporter-openstack-cell1-db-create-cvs82\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.954922 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.956614 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f5291e-16e8-4832-925e-05e2e5406607-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-cvs82\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.966027 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-28d7-account-create-update-f7776"] Mar 18 13:25:00 crc kubenswrapper[4912]: I0318 13:25:00.984203 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2br\" (UniqueName: \"kubernetes.io/projected/01f5291e-16e8-4832-925e-05e2e5406607-kube-api-access-gt2br\") pod \"mysqld-exporter-openstack-cell1-db-create-cvs82\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.066149 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.066926 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-operator-scripts\") pod \"mysqld-exporter-28d7-account-create-update-f7776\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.067003 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hlpz\" (UniqueName: \"kubernetes.io/projected/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-kube-api-access-7hlpz\") pod \"mysqld-exporter-28d7-account-create-update-f7776\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.171001 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-operator-scripts\") pod \"mysqld-exporter-28d7-account-create-update-f7776\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.171120 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hlpz\" (UniqueName: \"kubernetes.io/projected/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-kube-api-access-7hlpz\") pod \"mysqld-exporter-28d7-account-create-update-f7776\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.172574 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-operator-scripts\") pod \"mysqld-exporter-28d7-account-create-update-f7776\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.214117 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hlpz\" (UniqueName: \"kubernetes.io/projected/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-kube-api-access-7hlpz\") pod \"mysqld-exporter-28d7-account-create-update-f7776\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.279187 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.575183 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-z9qpx" podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.970325 4912 generic.go:334] "Generic (PLEG): container finished" podID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerID="009c5d2eb5fac2ed6b862f524b15bdf3bfffba4642973c8bf24872407bdcfd2e" exitCode=0 Mar 18 13:25:01 crc kubenswrapper[4912]: I0318 13:25:01.970410 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z9qpx" event={"ID":"a89dc2f9-7cb6-41c1-93ec-6204790a2b44","Type":"ContainerDied","Data":"009c5d2eb5fac2ed6b862f524b15bdf3bfffba4642973c8bf24872407bdcfd2e"} Mar 18 13:25:03 crc kubenswrapper[4912]: I0318 13:25:03.931372 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kt5tk" Mar 18 13:25:03 crc kubenswrapper[4912]: I0318 13:25:03.997016 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kt5tk" event={"ID":"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3","Type":"ContainerDied","Data":"3fd550dbc93f008f2754403ab34ba07cead51e32b8c1e5d7aa45fa45938b6168"} Mar 18 13:25:03 crc kubenswrapper[4912]: I0318 13:25:03.997090 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fd550dbc93f008f2754403ab34ba07cead51e32b8c1e5d7aa45fa45938b6168" Mar 18 13:25:03 crc kubenswrapper[4912]: I0318 13:25:03.997161 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kt5tk" Mar 18 13:25:04 crc kubenswrapper[4912]: I0318 13:25:04.062062 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-operator-scripts\") pod \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\" (UID: \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " Mar 18 13:25:04 crc kubenswrapper[4912]: I0318 13:25:04.062382 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl8tr\" (UniqueName: \"kubernetes.io/projected/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-kube-api-access-xl8tr\") pod \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\" (UID: \"8ef2a27a-656e-400d-9bdb-9f7c26fa08e3\") " Mar 18 13:25:04 crc kubenswrapper[4912]: I0318 13:25:04.063214 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ef2a27a-656e-400d-9bdb-9f7c26fa08e3" (UID: "8ef2a27a-656e-400d-9bdb-9f7c26fa08e3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:04 crc kubenswrapper[4912]: I0318 13:25:04.063668 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:04 crc kubenswrapper[4912]: I0318 13:25:04.072292 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-kube-api-access-xl8tr" (OuterVolumeSpecName: "kube-api-access-xl8tr") pod "8ef2a27a-656e-400d-9bdb-9f7c26fa08e3" (UID: "8ef2a27a-656e-400d-9bdb-9f7c26fa08e3"). InnerVolumeSpecName "kube-api-access-xl8tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:04 crc kubenswrapper[4912]: I0318 13:25:04.167575 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl8tr\" (UniqueName: \"kubernetes.io/projected/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3-kube-api-access-xl8tr\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.649005 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.748490 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kt5tk"] Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.757844 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kt5tk"] Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.822010 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-dns-svc\") pod \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.822243 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-sb\") pod \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.822442 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-config\") pod \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.822804 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-nb\") pod \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.822991 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b22wg\" (UniqueName: 
\"kubernetes.io/projected/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-kube-api-access-b22wg\") pod \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\" (UID: \"a89dc2f9-7cb6-41c1-93ec-6204790a2b44\") " Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.830452 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-kube-api-access-b22wg" (OuterVolumeSpecName: "kube-api-access-b22wg") pod "a89dc2f9-7cb6-41c1-93ec-6204790a2b44" (UID: "a89dc2f9-7cb6-41c1-93ec-6204790a2b44"). InnerVolumeSpecName "kube-api-access-b22wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.877595 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-config" (OuterVolumeSpecName: "config") pod "a89dc2f9-7cb6-41c1-93ec-6204790a2b44" (UID: "a89dc2f9-7cb6-41c1-93ec-6204790a2b44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.878067 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a89dc2f9-7cb6-41c1-93ec-6204790a2b44" (UID: "a89dc2f9-7cb6-41c1-93ec-6204790a2b44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.880880 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a89dc2f9-7cb6-41c1-93ec-6204790a2b44" (UID: "a89dc2f9-7cb6-41c1-93ec-6204790a2b44"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.883538 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a89dc2f9-7cb6-41c1-93ec-6204790a2b44" (UID: "a89dc2f9-7cb6-41c1-93ec-6204790a2b44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.926246 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.926287 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b22wg\" (UniqueName: \"kubernetes.io/projected/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-kube-api-access-b22wg\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.926301 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.926312 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:05 crc kubenswrapper[4912]: I0318 13:25:05.926321 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89dc2f9-7cb6-41c1-93ec-6204790a2b44-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.022762 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z9qpx" 
event={"ID":"a89dc2f9-7cb6-41c1-93ec-6204790a2b44","Type":"ContainerDied","Data":"ea4af018898f14bd7b81ac11f6674ab11731bc45ac180fec7ca27f4a6441f4e8"} Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.022813 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z9qpx" Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.022850 4912 scope.go:117] "RemoveContainer" containerID="009c5d2eb5fac2ed6b862f524b15bdf3bfffba4642973c8bf24872407bdcfd2e" Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.025145 4912 generic.go:334] "Generic (PLEG): container finished" podID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerID="1d2a166924f5b4f3e488f2ea7f00fb05c68b56c294cea47dc73fb81918b8845b" exitCode=0 Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.025219 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f54097ed-90b5-4369-8304-8bcb3a7d1839","Type":"ContainerDied","Data":"1d2a166924f5b4f3e488f2ea7f00fb05c68b56c294cea47dc73fb81918b8845b"} Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.034214 4912 generic.go:334] "Generic (PLEG): container finished" podID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerID="89b9b0744f0defd6a923dad45fc73dbb7c753a1d60556a747054c707ca6490fd" exitCode=0 Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.034292 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1","Type":"ContainerDied","Data":"89b9b0744f0defd6a923dad45fc73dbb7c753a1d60556a747054c707ca6490fd"} Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.037661 4912 generic.go:334] "Generic (PLEG): container finished" podID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerID="6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1" exitCode=0 Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.037719 4912 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e25d0c6c-24af-4cb6-b961-ae312ec23df9","Type":"ContainerDied","Data":"6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1"} Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.041084 4912 generic.go:334] "Generic (PLEG): container finished" podID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerID="44d2c20bf933e8a8a31df1480a2142c6fd3cd3bffe7c9d96198e1527e8ce82b2" exitCode=0 Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.041125 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97b795fb-bb07-4401-8e18-0b826303b4ba","Type":"ContainerDied","Data":"44d2c20bf933e8a8a31df1480a2142c6fd3cd3bffe7c9d96198e1527e8ce82b2"} Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.129881 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z9qpx"] Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.144105 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z9qpx"] Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.245085 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef2a27a-656e-400d-9bdb-9f7c26fa08e3" path="/var/lib/kubelet/pods/8ef2a27a-656e-400d-9bdb-9f7c26fa08e3/volumes" Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.245695 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" path="/var/lib/kubelet/pods/a89dc2f9-7cb6-41c1-93ec-6204790a2b44/volumes" Mar 18 13:25:06 crc kubenswrapper[4912]: I0318 13:25:06.900831 4912 scope.go:117] "RemoveContainer" containerID="5801ffc0cd0d73fbb0f23e8f6d53070d02b9ebf94757fef74cc501ba2a5593a6" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.000069 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.000133 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.000186 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.001582 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"905ac753c282bcb8ff90a4cd97a7fced94d0b0133ec58021181ae4cebb3a39a9"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.001657 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://905ac753c282bcb8ff90a4cd97a7fced94d0b0133ec58021181ae4cebb3a39a9" gracePeriod=600 Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.146669 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-fccc4d7b-dngkq" podUID="ea48ad05-2840-485b-9aef-8477c33cf61b" containerName="console" containerID="cri-o://43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188" gracePeriod=15 Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.372324 4912 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-cvs82"] Mar 18 13:25:07 crc kubenswrapper[4912]: W0318 13:25:07.385526 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f5291e_16e8_4832_925e_05e2e5406607.slice/crio-d7c965b34252000f6ec5cc4c6ff1c509634a8e606377b5fbd49a43241e069197 WatchSource:0}: Error finding container d7c965b34252000f6ec5cc4c6ff1c509634a8e606377b5fbd49a43241e069197: Status 404 returned error can't find the container with id d7c965b34252000f6ec5cc4c6ff1c509634a8e606377b5fbd49a43241e069197 Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.479938 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:25:07 crc kubenswrapper[4912]: E0318 13:25:07.480395 4912 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 13:25:07 crc kubenswrapper[4912]: E0318 13:25:07.480419 4912 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 13:25:07 crc kubenswrapper[4912]: E0318 13:25:07.480514 4912 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift podName:8f71e79a-72ad-4de7-9b24-7ac75884deae nodeName:}" failed. No retries permitted until 2026-03-18 13:25:23.480489696 +0000 UTC m=+1371.939917121 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift") pod "swift-storage-0" (UID: "8f71e79a-72ad-4de7-9b24-7ac75884deae") : configmap "swift-ring-files" not found Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.531937 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-28d7-account-create-update-f7776"] Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.715196 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fccc4d7b-dngkq_ea48ad05-2840-485b-9aef-8477c33cf61b/console/0.log" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.715343 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.804621 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-trusted-ca-bundle\") pod \"ea48ad05-2840-485b-9aef-8477c33cf61b\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.805257 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-console-config\") pod \"ea48ad05-2840-485b-9aef-8477c33cf61b\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.805349 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-service-ca\") pod \"ea48ad05-2840-485b-9aef-8477c33cf61b\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.805508 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-oauth-serving-cert\") pod \"ea48ad05-2840-485b-9aef-8477c33cf61b\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.805637 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-serving-cert\") pod \"ea48ad05-2840-485b-9aef-8477c33cf61b\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.805667 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqfqq\" (UniqueName: \"kubernetes.io/projected/ea48ad05-2840-485b-9aef-8477c33cf61b-kube-api-access-qqfqq\") pod \"ea48ad05-2840-485b-9aef-8477c33cf61b\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.805732 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-oauth-config\") pod \"ea48ad05-2840-485b-9aef-8477c33cf61b\" (UID: \"ea48ad05-2840-485b-9aef-8477c33cf61b\") " Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.807207 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ea48ad05-2840-485b-9aef-8477c33cf61b" (UID: "ea48ad05-2840-485b-9aef-8477c33cf61b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.807614 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-service-ca" (OuterVolumeSpecName: "service-ca") pod "ea48ad05-2840-485b-9aef-8477c33cf61b" (UID: "ea48ad05-2840-485b-9aef-8477c33cf61b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.807792 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ea48ad05-2840-485b-9aef-8477c33cf61b" (UID: "ea48ad05-2840-485b-9aef-8477c33cf61b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.808207 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-console-config" (OuterVolumeSpecName: "console-config") pod "ea48ad05-2840-485b-9aef-8477c33cf61b" (UID: "ea48ad05-2840-485b-9aef-8477c33cf61b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.818619 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ea48ad05-2840-485b-9aef-8477c33cf61b" (UID: "ea48ad05-2840-485b-9aef-8477c33cf61b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.819649 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea48ad05-2840-485b-9aef-8477c33cf61b-kube-api-access-qqfqq" (OuterVolumeSpecName: "kube-api-access-qqfqq") pod "ea48ad05-2840-485b-9aef-8477c33cf61b" (UID: "ea48ad05-2840-485b-9aef-8477c33cf61b"). InnerVolumeSpecName "kube-api-access-qqfqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.837854 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ea48ad05-2840-485b-9aef-8477c33cf61b" (UID: "ea48ad05-2840-485b-9aef-8477c33cf61b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.909094 4912 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.909134 4912 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.909144 4912 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.909153 4912 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.909163 4912 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea48ad05-2840-485b-9aef-8477c33cf61b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.909171 4912 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea48ad05-2840-485b-9aef-8477c33cf61b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:07 crc kubenswrapper[4912]: I0318 13:25:07.909180 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqfqq\" (UniqueName: \"kubernetes.io/projected/ea48ad05-2840-485b-9aef-8477c33cf61b-kube-api-access-qqfqq\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.062784 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.121621 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cczr8" event={"ID":"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b","Type":"ContainerStarted","Data":"b8c372174855c6801fc5e7ed6c2d49c185f2de438b223c67a7c7017f8edad550"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.142532 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="905ac753c282bcb8ff90a4cd97a7fced94d0b0133ec58021181ae4cebb3a39a9" exitCode=0 Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.142765 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"905ac753c282bcb8ff90a4cd97a7fced94d0b0133ec58021181ae4cebb3a39a9"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.142835 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"b9b062d12671b37c0b8053de923c0a7e885a374a0f869eb2369e0f8236cbd4db"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.142902 4912 scope.go:117] "RemoveContainer" containerID="d8ccf0d59f4df315e6c70763e698a5db3ca57ea836d4bda55d4c5996a0aad5df" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.167626 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-cczr8" podStartSLOduration=2.692860065 podStartE2EDuration="13.167600102s" podCreationTimestamp="2026-03-18 13:24:55 +0000 UTC" firstStartedPulling="2026-03-18 13:24:56.437904448 +0000 UTC m=+1344.897331873" lastFinishedPulling="2026-03-18 13:25:06.912644485 +0000 UTC m=+1355.372071910" observedRunningTime="2026-03-18 13:25:08.167383486 +0000 UTC m=+1356.626810911" watchObservedRunningTime="2026-03-18 13:25:08.167600102 +0000 UTC m=+1356.627027527" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.169084 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" event={"ID":"885ba47e-6a32-4a32-86c0-a6dbb63c33b0","Type":"ContainerStarted","Data":"6a218816a0a7c683763c056b96e0291cd8d36ab109896fc9fefabe9abaed5fb6"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.169147 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" event={"ID":"885ba47e-6a32-4a32-86c0-a6dbb63c33b0","Type":"ContainerStarted","Data":"a732346402f0e72df231ad29690d8f96f8d5b13cc870ca64d61caa25481e13df"} Mar 18 13:25:08 crc 
kubenswrapper[4912]: I0318 13:25:08.198077 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fccc4d7b-dngkq_ea48ad05-2840-485b-9aef-8477c33cf61b/console/0.log" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.198170 4912 generic.go:334] "Generic (PLEG): container finished" podID="ea48ad05-2840-485b-9aef-8477c33cf61b" containerID="43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188" exitCode=2 Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.198376 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fccc4d7b-dngkq" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.200345 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fccc4d7b-dngkq" event={"ID":"ea48ad05-2840-485b-9aef-8477c33cf61b","Type":"ContainerDied","Data":"43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.200564 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fccc4d7b-dngkq" event={"ID":"ea48ad05-2840-485b-9aef-8477c33cf61b","Type":"ContainerDied","Data":"586fda836b9031546f2e3eef89f09de12836e1b4ccc92bc2417aaa1edf2f05f8"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.212596 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e25d0c6c-24af-4cb6-b961-ae312ec23df9","Type":"ContainerStarted","Data":"e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.214271 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.222563 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" 
event={"ID":"97b795fb-bb07-4401-8e18-0b826303b4ba","Type":"ContainerStarted","Data":"8d7e357063efa4771949307841a428e486507bdf5be05cfb41589e927c52092d"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.222786 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.258684 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.775883135 podStartE2EDuration="1m5.258654759s" podCreationTimestamp="2026-03-18 13:24:03 +0000 UTC" firstStartedPulling="2026-03-18 13:24:06.22406706 +0000 UTC m=+1294.683494485" lastFinishedPulling="2026-03-18 13:24:30.706838684 +0000 UTC m=+1319.166266109" observedRunningTime="2026-03-18 13:25:08.257318283 +0000 UTC m=+1356.716745718" watchObservedRunningTime="2026-03-18 13:25:08.258654759 +0000 UTC m=+1356.718082184" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.313627 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" event={"ID":"01f5291e-16e8-4832-925e-05e2e5406607","Type":"ContainerStarted","Data":"9efea96349526457502ef8478fd55e31eff36e99f3b44dc8234296e34c9d35f0"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.314126 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" event={"ID":"01f5291e-16e8-4832-925e-05e2e5406607","Type":"ContainerStarted","Data":"d7c965b34252000f6ec5cc4c6ff1c509634a8e606377b5fbd49a43241e069197"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.314143 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f54097ed-90b5-4369-8304-8bcb3a7d1839","Type":"ContainerStarted","Data":"b15d4fb8c42327da3fbd061f47dd161f5f210106735fd6e8f75cc8b69d650bee"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.314156 4912 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerStarted","Data":"d1af359093d0e429a68984f9a01904c2313a0a170d01fbf3508cc8e1dd2fbcab"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.314170 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1","Type":"ContainerStarted","Data":"b13a18336ccf9bb2bab92af4c7b6dab70113fa9af0e67388845f718f1c98a197"} Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.314531 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.314653 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.328683 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" podStartSLOduration=8.32865481 podStartE2EDuration="8.32865481s" podCreationTimestamp="2026-03-18 13:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:25:08.297313638 +0000 UTC m=+1356.756741083" watchObservedRunningTime="2026-03-18 13:25:08.32865481 +0000 UTC m=+1356.788082235" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.358553 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=40.661973233 podStartE2EDuration="1m5.358522903s" podCreationTimestamp="2026-03-18 13:24:03 +0000 UTC" firstStartedPulling="2026-03-18 13:24:06.050280619 +0000 UTC m=+1294.509708044" lastFinishedPulling="2026-03-18 13:24:30.746830289 +0000 UTC m=+1319.206257714" observedRunningTime="2026-03-18 13:25:08.350930279 +0000 UTC m=+1356.810357724" 
watchObservedRunningTime="2026-03-18 13:25:08.358522903 +0000 UTC m=+1356.817950338" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.400771 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fccc4d7b-dngkq"] Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.429304 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-fccc4d7b-dngkq"] Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.445837 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.243453011 podStartE2EDuration="1m5.445805029s" podCreationTimestamp="2026-03-18 13:24:03 +0000 UTC" firstStartedPulling="2026-03-18 13:24:05.527465488 +0000 UTC m=+1293.986892913" lastFinishedPulling="2026-03-18 13:24:29.729817506 +0000 UTC m=+1318.189244931" observedRunningTime="2026-03-18 13:25:08.412295008 +0000 UTC m=+1356.871722453" watchObservedRunningTime="2026-03-18 13:25:08.445805029 +0000 UTC m=+1356.905232464" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.458852 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=40.992491107 podStartE2EDuration="1m5.458828159s" podCreationTimestamp="2026-03-18 13:24:03 +0000 UTC" firstStartedPulling="2026-03-18 13:24:06.224144552 +0000 UTC m=+1294.683571977" lastFinishedPulling="2026-03-18 13:24:30.690481604 +0000 UTC m=+1319.149909029" observedRunningTime="2026-03-18 13:25:08.45180966 +0000 UTC m=+1356.911237105" watchObservedRunningTime="2026-03-18 13:25:08.458828159 +0000 UTC m=+1356.918255584" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.564238 4912 scope.go:117] "RemoveContainer" containerID="43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.593655 4912 scope.go:117] "RemoveContainer" 
containerID="43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188" Mar 18 13:25:08 crc kubenswrapper[4912]: E0318 13:25:08.594882 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188\": container with ID starting with 43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188 not found: ID does not exist" containerID="43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188" Mar 18 13:25:08 crc kubenswrapper[4912]: I0318 13:25:08.594941 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188"} err="failed to get container status \"43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188\": rpc error: code = NotFound desc = could not find container \"43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188\": container with ID starting with 43daa02bbd9598b31ab365a8c91d9518a1ac8fc500103b17745713ff285e6188 not found: ID does not exist" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.285566 4912 generic.go:334] "Generic (PLEG): container finished" podID="01f5291e-16e8-4832-925e-05e2e5406607" containerID="9efea96349526457502ef8478fd55e31eff36e99f3b44dc8234296e34c9d35f0" exitCode=0 Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.285621 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" event={"ID":"01f5291e-16e8-4832-925e-05e2e5406607","Type":"ContainerDied","Data":"9efea96349526457502ef8478fd55e31eff36e99f3b44dc8234296e34c9d35f0"} Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.293191 4912 generic.go:334] "Generic (PLEG): container finished" podID="885ba47e-6a32-4a32-86c0-a6dbb63c33b0" containerID="6a218816a0a7c683763c056b96e0291cd8d36ab109896fc9fefabe9abaed5fb6" exitCode=0 Mar 18 13:25:09 crc 
kubenswrapper[4912]: I0318 13:25:09.293306 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" event={"ID":"885ba47e-6a32-4a32-86c0-a6dbb63c33b0","Type":"ContainerDied","Data":"6a218816a0a7c683763c056b96e0291cd8d36ab109896fc9fefabe9abaed5fb6"} Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.466500 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cmbw2"] Mar 18 13:25:09 crc kubenswrapper[4912]: E0318 13:25:09.467456 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea48ad05-2840-485b-9aef-8477c33cf61b" containerName="console" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.467477 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea48ad05-2840-485b-9aef-8477c33cf61b" containerName="console" Mar 18 13:25:09 crc kubenswrapper[4912]: E0318 13:25:09.467487 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef2a27a-656e-400d-9bdb-9f7c26fa08e3" containerName="mariadb-account-create-update" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.467494 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef2a27a-656e-400d-9bdb-9f7c26fa08e3" containerName="mariadb-account-create-update" Mar 18 13:25:09 crc kubenswrapper[4912]: E0318 13:25:09.467511 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerName="dnsmasq-dns" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.467519 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerName="dnsmasq-dns" Mar 18 13:25:09 crc kubenswrapper[4912]: E0318 13:25:09.467528 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerName="init" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.467534 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerName="init" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.467740 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea48ad05-2840-485b-9aef-8477c33cf61b" containerName="console" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.467760 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89dc2f9-7cb6-41c1-93ec-6204790a2b44" containerName="dnsmasq-dns" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.467772 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef2a27a-656e-400d-9bdb-9f7c26fa08e3" containerName="mariadb-account-create-update" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.468566 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.471762 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.479232 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cmbw2"] Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.582336 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqkl\" (UniqueName: \"kubernetes.io/projected/8869339b-65c0-43fd-b129-600907565615-kube-api-access-rvqkl\") pod \"root-account-create-update-cmbw2\" (UID: \"8869339b-65c0-43fd-b129-600907565615\") " pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.582857 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8869339b-65c0-43fd-b129-600907565615-operator-scripts\") pod \"root-account-create-update-cmbw2\" (UID: 
\"8869339b-65c0-43fd-b129-600907565615\") " pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.684937 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqkl\" (UniqueName: \"kubernetes.io/projected/8869339b-65c0-43fd-b129-600907565615-kube-api-access-rvqkl\") pod \"root-account-create-update-cmbw2\" (UID: \"8869339b-65c0-43fd-b129-600907565615\") " pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.685066 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8869339b-65c0-43fd-b129-600907565615-operator-scripts\") pod \"root-account-create-update-cmbw2\" (UID: \"8869339b-65c0-43fd-b129-600907565615\") " pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.686271 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8869339b-65c0-43fd-b129-600907565615-operator-scripts\") pod \"root-account-create-update-cmbw2\" (UID: \"8869339b-65c0-43fd-b129-600907565615\") " pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.724935 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqkl\" (UniqueName: \"kubernetes.io/projected/8869339b-65c0-43fd-b129-600907565615-kube-api-access-rvqkl\") pod \"root-account-create-update-cmbw2\" (UID: \"8869339b-65c0-43fd-b129-600907565615\") " pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:09 crc kubenswrapper[4912]: I0318 13:25:09.803566 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:10 crc kubenswrapper[4912]: I0318 13:25:10.250732 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea48ad05-2840-485b-9aef-8477c33cf61b" path="/var/lib/kubelet/pods/ea48ad05-2840-485b-9aef-8477c33cf61b/volumes" Mar 18 13:25:10 crc kubenswrapper[4912]: I0318 13:25:10.358861 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cmbw2"] Mar 18 13:25:10 crc kubenswrapper[4912]: W0318 13:25:10.448656 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8869339b_65c0_43fd_b129_600907565615.slice/crio-156bfdcf58e23defdc1b8c42cd287b86d2776c87e6ba71c81fb107ecdbbbff56 WatchSource:0}: Error finding container 156bfdcf58e23defdc1b8c42cd287b86d2776c87e6ba71c81fb107ecdbbbff56: Status 404 returned error can't find the container with id 156bfdcf58e23defdc1b8c42cd287b86d2776c87e6ba71c81fb107ecdbbbff56 Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.287451 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.294443 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.352054 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" event={"ID":"885ba47e-6a32-4a32-86c0-a6dbb63c33b0","Type":"ContainerDied","Data":"a732346402f0e72df231ad29690d8f96f8d5b13cc870ca64d61caa25481e13df"} Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.352110 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a732346402f0e72df231ad29690d8f96f8d5b13cc870ca64d61caa25481e13df" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.352184 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-28d7-account-create-update-f7776" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.356586 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cmbw2" event={"ID":"8869339b-65c0-43fd-b129-600907565615","Type":"ContainerStarted","Data":"8ced65959a177d85fdc3b18853e63692148431992035e9476eb763c588e01c9b"} Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.356641 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cmbw2" event={"ID":"8869339b-65c0-43fd-b129-600907565615","Type":"ContainerStarted","Data":"156bfdcf58e23defdc1b8c42cd287b86d2776c87e6ba71c81fb107ecdbbbff56"} Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.362597 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" event={"ID":"01f5291e-16e8-4832-925e-05e2e5406607","Type":"ContainerDied","Data":"d7c965b34252000f6ec5cc4c6ff1c509634a8e606377b5fbd49a43241e069197"} Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.362669 4912 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d7c965b34252000f6ec5cc4c6ff1c509634a8e606377b5fbd49a43241e069197" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.362759 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-cvs82" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.407659 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-cmbw2" podStartSLOduration=2.407627491 podStartE2EDuration="2.407627491s" podCreationTimestamp="2026-03-18 13:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:25:11.382559767 +0000 UTC m=+1359.841987192" watchObservedRunningTime="2026-03-18 13:25:11.407627491 +0000 UTC m=+1359.867054916" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.439527 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f5291e-16e8-4832-925e-05e2e5406607-operator-scripts\") pod \"01f5291e-16e8-4832-925e-05e2e5406607\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.439757 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hlpz\" (UniqueName: \"kubernetes.io/projected/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-kube-api-access-7hlpz\") pod \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.439865 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt2br\" (UniqueName: \"kubernetes.io/projected/01f5291e-16e8-4832-925e-05e2e5406607-kube-api-access-gt2br\") pod \"01f5291e-16e8-4832-925e-05e2e5406607\" (UID: \"01f5291e-16e8-4832-925e-05e2e5406607\") " Mar 18 13:25:11 crc kubenswrapper[4912]: 
I0318 13:25:11.439994 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-operator-scripts\") pod \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\" (UID: \"885ba47e-6a32-4a32-86c0-a6dbb63c33b0\") " Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.440968 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f5291e-16e8-4832-925e-05e2e5406607-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01f5291e-16e8-4832-925e-05e2e5406607" (UID: "01f5291e-16e8-4832-925e-05e2e5406607"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.442089 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "885ba47e-6a32-4a32-86c0-a6dbb63c33b0" (UID: "885ba47e-6a32-4a32-86c0-a6dbb63c33b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.465797 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-kube-api-access-7hlpz" (OuterVolumeSpecName: "kube-api-access-7hlpz") pod "885ba47e-6a32-4a32-86c0-a6dbb63c33b0" (UID: "885ba47e-6a32-4a32-86c0-a6dbb63c33b0"). InnerVolumeSpecName "kube-api-access-7hlpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.475973 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f5291e-16e8-4832-925e-05e2e5406607-kube-api-access-gt2br" (OuterVolumeSpecName: "kube-api-access-gt2br") pod "01f5291e-16e8-4832-925e-05e2e5406607" (UID: "01f5291e-16e8-4832-925e-05e2e5406607"). InnerVolumeSpecName "kube-api-access-gt2br". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.545152 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f5291e-16e8-4832-925e-05e2e5406607-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.545205 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt2br\" (UniqueName: \"kubernetes.io/projected/01f5291e-16e8-4832-925e-05e2e5406607-kube-api-access-gt2br\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.545224 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hlpz\" (UniqueName: \"kubernetes.io/projected/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-kube-api-access-7hlpz\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:11 crc kubenswrapper[4912]: I0318 13:25:11.545240 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ba47e-6a32-4a32-86c0-a6dbb63c33b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:12 crc kubenswrapper[4912]: I0318 13:25:12.409456 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerStarted","Data":"a8d9f4d281be291c4edb28b0d4e59e0e7d084e8a8634af5b5794d113bd83311f"} Mar 18 13:25:12 crc kubenswrapper[4912]: I0318 
13:25:12.416144 4912 generic.go:334] "Generic (PLEG): container finished" podID="8869339b-65c0-43fd-b129-600907565615" containerID="8ced65959a177d85fdc3b18853e63692148431992035e9476eb763c588e01c9b" exitCode=0 Mar 18 13:25:12 crc kubenswrapper[4912]: I0318 13:25:12.416208 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cmbw2" event={"ID":"8869339b-65c0-43fd-b129-600907565615","Type":"ContainerDied","Data":"8ced65959a177d85fdc3b18853e63692148431992035e9476eb763c588e01c9b"} Mar 18 13:25:12 crc kubenswrapper[4912]: I0318 13:25:12.693475 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7nd97" podUID="5353be6e-99f8-4367-a237-99e0bd3bab04" containerName="ovn-controller" probeResult="failure" output=< Mar 18 13:25:12 crc kubenswrapper[4912]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 13:25:12 crc kubenswrapper[4912]: > Mar 18 13:25:12 crc kubenswrapper[4912]: I0318 13:25:12.777404 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:25:12 crc kubenswrapper[4912]: I0318 13:25:12.785911 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tbb6v" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.167803 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7nd97-config-kb5zr"] Mar 18 13:25:13 crc kubenswrapper[4912]: E0318 13:25:13.168670 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="885ba47e-6a32-4a32-86c0-a6dbb63c33b0" containerName="mariadb-account-create-update" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.168690 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="885ba47e-6a32-4a32-86c0-a6dbb63c33b0" containerName="mariadb-account-create-update" Mar 18 13:25:13 crc kubenswrapper[4912]: E0318 13:25:13.168711 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="01f5291e-16e8-4832-925e-05e2e5406607" containerName="mariadb-database-create" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.168719 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f5291e-16e8-4832-925e-05e2e5406607" containerName="mariadb-database-create" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.168940 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f5291e-16e8-4832-925e-05e2e5406607" containerName="mariadb-database-create" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.168964 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="885ba47e-6a32-4a32-86c0-a6dbb63c33b0" containerName="mariadb-account-create-update" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.177172 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.179848 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.182856 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7nd97-config-kb5zr"] Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.292903 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run-ovn\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.293017 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-additional-scripts\") pod 
\"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.293058 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-log-ovn\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.293110 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-scripts\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.293154 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.293190 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvld\" (UniqueName: \"kubernetes.io/projected/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-kube-api-access-rsvld\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.395891 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run-ovn\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.396015 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-additional-scripts\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.396070 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-log-ovn\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.396149 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-scripts\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.396212 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.396245 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsvld\" (UniqueName: 
\"kubernetes.io/projected/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-kube-api-access-rsvld\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.396527 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run-ovn\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.396634 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-log-ovn\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.397357 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-additional-scripts\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.397429 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.398517 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-scripts\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.464994 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsvld\" (UniqueName: \"kubernetes.io/projected/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-kube-api-access-rsvld\") pod \"ovn-controller-7nd97-config-kb5zr\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:13 crc kubenswrapper[4912]: I0318 13:25:13.512902 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.062563 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.072352 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.075704 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.094579 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.110185 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.110319 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs8dd\" (UniqueName: \"kubernetes.io/projected/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-kube-api-access-rs8dd\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.110425 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-config-data\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.213720 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.214197 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs8dd\" (UniqueName: \"kubernetes.io/projected/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-kube-api-access-rs8dd\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.214397 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-config-data\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.230303 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-config-data\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.243136 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.253813 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs8dd\" (UniqueName: \"kubernetes.io/projected/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-kube-api-access-rs8dd\") pod \"mysqld-exporter-0\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " pod="openstack/mysqld-exporter-0" Mar 18 13:25:16 crc kubenswrapper[4912]: I0318 13:25:16.412017 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 13:25:17 crc kubenswrapper[4912]: I0318 13:25:17.676359 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7nd97" podUID="5353be6e-99f8-4367-a237-99e0bd3bab04" containerName="ovn-controller" probeResult="failure" output=< Mar 18 13:25:17 crc kubenswrapper[4912]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 13:25:17 crc kubenswrapper[4912]: > Mar 18 13:25:18 crc kubenswrapper[4912]: I0318 13:25:18.486241 4912 generic.go:334] "Generic (PLEG): container finished" podID="4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" containerID="b8c372174855c6801fc5e7ed6c2d49c185f2de438b223c67a7c7017f8edad550" exitCode=0 Mar 18 13:25:18 crc kubenswrapper[4912]: I0318 13:25:18.486342 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cczr8" event={"ID":"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b","Type":"ContainerDied","Data":"b8c372174855c6801fc5e7ed6c2d49c185f2de438b223c67a7c7017f8edad550"} Mar 18 13:25:22 crc kubenswrapper[4912]: I0318 13:25:22.674483 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7nd97" podUID="5353be6e-99f8-4367-a237-99e0bd3bab04" containerName="ovn-controller" probeResult="failure" output=< Mar 18 13:25:22 crc kubenswrapper[4912]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 13:25:22 crc kubenswrapper[4912]: > Mar 18 13:25:22 crc kubenswrapper[4912]: I0318 13:25:22.994129 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:25:22 crc kubenswrapper[4912]: I0318 13:25:22.995950 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:23 crc kubenswrapper[4912]: E0318 13:25:23.022589 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 18 13:25:23 crc kubenswrapper[4912]: E0318 13:25:23.022767 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psnkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privil
eged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-krlbd_openstack(a92c61dc-cfdf-4610-81b7-553c9882fc26): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:25:23 crc kubenswrapper[4912]: E0318 13:25:23.024097 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-krlbd" podUID="a92c61dc-cfdf-4610-81b7-553c9882fc26" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.098838 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-combined-ca-bundle\") pod \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.098973 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-ring-data-devices\") pod \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.099089 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4cpn\" (UniqueName: 
\"kubernetes.io/projected/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-kube-api-access-d4cpn\") pod \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.099179 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-swiftconf\") pod \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.099291 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-etc-swift\") pod \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.099435 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvqkl\" (UniqueName: \"kubernetes.io/projected/8869339b-65c0-43fd-b129-600907565615-kube-api-access-rvqkl\") pod \"8869339b-65c0-43fd-b129-600907565615\" (UID: \"8869339b-65c0-43fd-b129-600907565615\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.099572 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-scripts\") pod \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.099668 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8869339b-65c0-43fd-b129-600907565615-operator-scripts\") pod \"8869339b-65c0-43fd-b129-600907565615\" (UID: \"8869339b-65c0-43fd-b129-600907565615\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.099747 
4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-dispersionconf\") pod \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\" (UID: \"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b\") " Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.103393 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8869339b-65c0-43fd-b129-600907565615-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8869339b-65c0-43fd-b129-600907565615" (UID: "8869339b-65c0-43fd-b129-600907565615"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.105154 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" (UID: "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.107530 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" (UID: "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.108394 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-kube-api-access-d4cpn" (OuterVolumeSpecName: "kube-api-access-d4cpn") pod "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" (UID: "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b"). 
InnerVolumeSpecName "kube-api-access-d4cpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.112208 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" (UID: "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.113929 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8869339b-65c0-43fd-b129-600907565615-kube-api-access-rvqkl" (OuterVolumeSpecName: "kube-api-access-rvqkl") pod "8869339b-65c0-43fd-b129-600907565615" (UID: "8869339b-65c0-43fd-b129-600907565615"). InnerVolumeSpecName "kube-api-access-rvqkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.133924 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-scripts" (OuterVolumeSpecName: "scripts") pod "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" (UID: "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.160102 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" (UID: "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.187171 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" (UID: "4b02fe29-bc51-4fc4-86e7-44fb75e20e2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203136 4912 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203171 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203182 4912 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203192 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4cpn\" (UniqueName: \"kubernetes.io/projected/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-kube-api-access-d4cpn\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203205 4912 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203213 4912 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203224 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvqkl\" (UniqueName: \"kubernetes.io/projected/8869339b-65c0-43fd-b129-600907565615-kube-api-access-rvqkl\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203236 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b02fe29-bc51-4fc4-86e7-44fb75e20e2b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.203245 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8869339b-65c0-43fd-b129-600907565615-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.304169 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7nd97-config-kb5zr"] Mar 18 13:25:23 crc kubenswrapper[4912]: W0318 13:25:23.400089 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67d2d27c_ebb6_4e31_ad64_1fc4e5bace02.slice/crio-d1a7d1f86a09f6df82db53112361085de1314c0694a5310e08cb8027de7644fa WatchSource:0}: Error finding container d1a7d1f86a09f6df82db53112361085de1314c0694a5310e08cb8027de7644fa: Status 404 returned error can't find the container with id d1a7d1f86a09f6df82db53112361085de1314c0694a5310e08cb8027de7644fa Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.402593 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.513831 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.521378 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f71e79a-72ad-4de7-9b24-7ac75884deae-etc-swift\") pod \"swift-storage-0\" (UID: \"8f71e79a-72ad-4de7-9b24-7ac75884deae\") " pod="openstack/swift-storage-0" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.550893 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cmbw2" event={"ID":"8869339b-65c0-43fd-b129-600907565615","Type":"ContainerDied","Data":"156bfdcf58e23defdc1b8c42cd287b86d2776c87e6ba71c81fb107ecdbbbff56"} Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.550961 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="156bfdcf58e23defdc1b8c42cd287b86d2776c87e6ba71c81fb107ecdbbbff56" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.550955 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cmbw2" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.562486 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02","Type":"ContainerStarted","Data":"d1a7d1f86a09f6df82db53112361085de1314c0694a5310e08cb8027de7644fa"} Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.565065 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cczr8" event={"ID":"4b02fe29-bc51-4fc4-86e7-44fb75e20e2b","Type":"ContainerDied","Data":"5db5e45b0c6726adf4756c3c72ec8866e8e896e11bc7ce59783cc73822374198"} Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.565092 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cczr8" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.565123 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db5e45b0c6726adf4756c3c72ec8866e8e896e11bc7ce59783cc73822374198" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.567926 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97-config-kb5zr" event={"ID":"c754b8c7-2990-43b6-a5e5-5b6b667b27b5","Type":"ContainerStarted","Data":"a230150f6b63d7ba22fd1eaf0bdb747da7a706023b7830cd446a86368c106753"} Mar 18 13:25:23 crc kubenswrapper[4912]: E0318 13:25:23.568695 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-krlbd" podUID="a92c61dc-cfdf-4610-81b7-553c9882fc26" Mar 18 13:25:23 crc kubenswrapper[4912]: I0318 13:25:23.592423 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 13:25:24 crc kubenswrapper[4912]: I0318 13:25:24.206490 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 13:25:24 crc kubenswrapper[4912]: W0318 13:25:24.234722 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f71e79a_72ad_4de7_9b24_7ac75884deae.slice/crio-9d34d055b8739515df718986c298bb2ef826c3429f1f6ecfc7c811252ff59b21 WatchSource:0}: Error finding container 9d34d055b8739515df718986c298bb2ef826c3429f1f6ecfc7c811252ff59b21: Status 404 returned error can't find the container with id 9d34d055b8739515df718986c298bb2ef826c3429f1f6ecfc7c811252ff59b21 Mar 18 13:25:24 crc kubenswrapper[4912]: I0318 13:25:24.592261 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"9d34d055b8739515df718986c298bb2ef826c3429f1f6ecfc7c811252ff59b21"} Mar 18 13:25:24 crc kubenswrapper[4912]: I0318 13:25:24.595237 4912 generic.go:334] "Generic (PLEG): container finished" podID="c754b8c7-2990-43b6-a5e5-5b6b667b27b5" containerID="3e67950d5f8f5d50765891297796ce57b13665db958e646afebba87c1a7856a0" exitCode=0 Mar 18 13:25:24 crc kubenswrapper[4912]: I0318 13:25:24.595502 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97-config-kb5zr" event={"ID":"c754b8c7-2990-43b6-a5e5-5b6b667b27b5","Type":"ContainerDied","Data":"3e67950d5f8f5d50765891297796ce57b13665db958e646afebba87c1a7856a0"} Mar 18 13:25:24 crc kubenswrapper[4912]: I0318 13:25:24.798364 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:25:24 crc kubenswrapper[4912]: I0318 13:25:24.998078 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" 
podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 18 13:25:25 crc kubenswrapper[4912]: I0318 13:25:25.024889 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 18 13:25:25 crc kubenswrapper[4912]: I0318 13:25:25.037031 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 18 13:25:25 crc kubenswrapper[4912]: I0318 13:25:25.793002 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cmbw2"] Mar 18 13:25:25 crc kubenswrapper[4912]: I0318 13:25:25.805818 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cmbw2"] Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.246993 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8869339b-65c0-43fd-b129-600907565615" path="/var/lib/kubelet/pods/8869339b-65c0-43fd-b129-600907565615/volumes" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.457443 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.606883 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-log-ovn\") pod \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.607064 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run\") pod \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.607205 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run-ovn\") pod \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.607261 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-additional-scripts\") pod \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.607291 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-scripts\") pod \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.607323 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsvld\" (UniqueName: 
\"kubernetes.io/projected/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-kube-api-access-rsvld\") pod \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\" (UID: \"c754b8c7-2990-43b6-a5e5-5b6b667b27b5\") " Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.607521 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c754b8c7-2990-43b6-a5e5-5b6b667b27b5" (UID: "c754b8c7-2990-43b6-a5e5-5b6b667b27b5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.607597 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run" (OuterVolumeSpecName: "var-run") pod "c754b8c7-2990-43b6-a5e5-5b6b667b27b5" (UID: "c754b8c7-2990-43b6-a5e5-5b6b667b27b5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.611302 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-scripts" (OuterVolumeSpecName: "scripts") pod "c754b8c7-2990-43b6-a5e5-5b6b667b27b5" (UID: "c754b8c7-2990-43b6-a5e5-5b6b667b27b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.611280 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c754b8c7-2990-43b6-a5e5-5b6b667b27b5" (UID: "c754b8c7-2990-43b6-a5e5-5b6b667b27b5"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.615510 4912 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.615634 4912 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.630655 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c754b8c7-2990-43b6-a5e5-5b6b667b27b5" (UID: "c754b8c7-2990-43b6-a5e5-5b6b667b27b5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.654480 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-kube-api-access-rsvld" (OuterVolumeSpecName: "kube-api-access-rsvld") pod "c754b8c7-2990-43b6-a5e5-5b6b667b27b5" (UID: "c754b8c7-2990-43b6-a5e5-5b6b667b27b5"). InnerVolumeSpecName "kube-api-access-rsvld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.659136 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97-config-kb5zr" event={"ID":"c754b8c7-2990-43b6-a5e5-5b6b667b27b5","Type":"ContainerDied","Data":"a230150f6b63d7ba22fd1eaf0bdb747da7a706023b7830cd446a86368c106753"} Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.659295 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a230150f6b63d7ba22fd1eaf0bdb747da7a706023b7830cd446a86368c106753" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.659439 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7nd97-config-kb5zr" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.717760 4912 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.717814 4912 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.717829 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:26 crc kubenswrapper[4912]: I0318 13:25:26.717840 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsvld\" (UniqueName: \"kubernetes.io/projected/c754b8c7-2990-43b6-a5e5-5b6b667b27b5-kube-api-access-rsvld\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.603529 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-controller-7nd97-config-kb5zr"] Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.634971 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7nd97-config-kb5zr"] Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.709440 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerStarted","Data":"64c86d6e3878ff47f0697a7a5cbfc14a1eeee8ed2bf4671f91bc03cc3dbea0cd"} Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.735634 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"5614ad18157cda41b04ec19626898bc0c8d31aa3b18c6ff10fd7ab0d1e6f804d"} Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.735692 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"af6f5feddc0d6fe571213ed84bf1ee271c1d85ff12add8932a1f785021a7d9e5"} Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.748882 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02","Type":"ContainerStarted","Data":"b9f591b5c7fa3912020ec3274b317aff7311520a8700480dbb8179b9d2e81c48"} Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.796454 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.832975744 podStartE2EDuration="1m17.796422452s" podCreationTimestamp="2026-03-18 13:24:10 +0000 UTC" firstStartedPulling="2026-03-18 13:24:29.721205624 +0000 UTC m=+1318.180633059" lastFinishedPulling="2026-03-18 13:25:26.684652342 +0000 UTC m=+1375.144079767" observedRunningTime="2026-03-18 13:25:27.775026158 +0000 UTC m=+1376.234453593" 
watchObservedRunningTime="2026-03-18 13:25:27.796422452 +0000 UTC m=+1376.255849877" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.844114 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=8.219450366 podStartE2EDuration="11.844092542s" podCreationTimestamp="2026-03-18 13:25:16 +0000 UTC" firstStartedPulling="2026-03-18 13:25:23.403577167 +0000 UTC m=+1371.863004592" lastFinishedPulling="2026-03-18 13:25:27.028219343 +0000 UTC m=+1375.487646768" observedRunningTime="2026-03-18 13:25:27.842247872 +0000 UTC m=+1376.301675297" watchObservedRunningTime="2026-03-18 13:25:27.844092542 +0000 UTC m=+1376.303519967" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.901747 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-7nd97" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.930144 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7nd97-config-9fxbp"] Mar 18 13:25:27 crc kubenswrapper[4912]: E0318 13:25:27.930797 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c754b8c7-2990-43b6-a5e5-5b6b667b27b5" containerName="ovn-config" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.930814 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c754b8c7-2990-43b6-a5e5-5b6b667b27b5" containerName="ovn-config" Mar 18 13:25:27 crc kubenswrapper[4912]: E0318 13:25:27.930840 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" containerName="swift-ring-rebalance" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.930848 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" containerName="swift-ring-rebalance" Mar 18 13:25:27 crc kubenswrapper[4912]: E0318 13:25:27.930866 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8869339b-65c0-43fd-b129-600907565615" containerName="mariadb-account-create-update" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.930874 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8869339b-65c0-43fd-b129-600907565615" containerName="mariadb-account-create-update" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.931173 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8869339b-65c0-43fd-b129-600907565615" containerName="mariadb-account-create-update" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.931208 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b02fe29-bc51-4fc4-86e7-44fb75e20e2b" containerName="swift-ring-rebalance" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.931221 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c754b8c7-2990-43b6-a5e5-5b6b667b27b5" containerName="ovn-config" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.932148 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.942841 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 13:25:27 crc kubenswrapper[4912]: I0318 13:25:27.975438 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7nd97-config-9fxbp"] Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.066187 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-log-ovn\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.066253 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run-ovn\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.066273 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-additional-scripts\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.066322 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-scripts\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: 
\"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.066349 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4tj\" (UniqueName: \"kubernetes.io/projected/12b67fd1-7452-4928-9e15-2a1e780b14bb-kube-api-access-qd4tj\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.066439 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.168453 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.168591 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-log-ovn\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.168646 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run-ovn\") pod 
\"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.168671 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-additional-scripts\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.168737 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-scripts\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.168773 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4tj\" (UniqueName: \"kubernetes.io/projected/12b67fd1-7452-4928-9e15-2a1e780b14bb-kube-api-access-qd4tj\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.169501 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run-ovn\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.169614 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run\") pod 
\"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.169669 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-log-ovn\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.170189 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-additional-scripts\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.171503 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-scripts\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.193445 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4tj\" (UniqueName: \"kubernetes.io/projected/12b67fd1-7452-4928-9e15-2a1e780b14bb-kube-api-access-qd4tj\") pod \"ovn-controller-7nd97-config-9fxbp\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") " pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.243710 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c754b8c7-2990-43b6-a5e5-5b6b667b27b5" path="/var/lib/kubelet/pods/c754b8c7-2990-43b6-a5e5-5b6b667b27b5/volumes" Mar 18 13:25:28 crc kubenswrapper[4912]: 
I0318 13:25:28.323737 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7nd97-config-9fxbp" Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.768672 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"e7c7547872df38410fdd8912a7ebb84da0d394de76538717dd72fd25ece29438"} Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.769283 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"5b95fed1dd6252abd5d91cdd62a60f4d5cb44c3e65356c7786b13ab145f6b291"} Mar 18 13:25:28 crc kubenswrapper[4912]: I0318 13:25:28.923466 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7nd97-config-9fxbp"] Mar 18 13:25:29 crc kubenswrapper[4912]: I0318 13:25:29.792114 4912 generic.go:334] "Generic (PLEG): container finished" podID="12b67fd1-7452-4928-9e15-2a1e780b14bb" containerID="93dad7287a0f64e9f789b5e069319da6913ae99cbcb2dc29d179e8dca5505371" exitCode=0 Mar 18 13:25:29 crc kubenswrapper[4912]: I0318 13:25:29.792180 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97-config-9fxbp" event={"ID":"12b67fd1-7452-4928-9e15-2a1e780b14bb","Type":"ContainerDied","Data":"93dad7287a0f64e9f789b5e069319da6913ae99cbcb2dc29d179e8dca5505371"} Mar 18 13:25:29 crc kubenswrapper[4912]: I0318 13:25:29.793005 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97-config-9fxbp" event={"ID":"12b67fd1-7452-4928-9e15-2a1e780b14bb","Type":"ContainerStarted","Data":"aa8ffa51d67553ee144a1f86b51abd5997883d9abb033ca7d270a0f1aec01cca"} Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.783830 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-q2jgz"] Mar 18 13:25:30 crc 
kubenswrapper[4912]: I0318 13:25:30.786801 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.791949 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.796444 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q2jgz"]
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.819545 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"57be44cd8e6a3c70fb3c9d1078a21b582f2c4c7458509230fd48ef373fa5f106"}
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.819613 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"106aee74f35271e75e9648a8fa08935f01e076c4d72a6b5609c925195c614e57"}
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.819628 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"e92fcd26b39bd991926e495d4a221af59c13d28606eb42d48d8bf1df149f0d09"}
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.819647 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"392ce8a583f591a5c7c1ea34a98f546622c3fe8c108340983be29e53db419985"}
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.851884 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-operator-scripts\") pod \"root-account-create-update-q2jgz\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") " pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.852101 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjzv\" (UniqueName: \"kubernetes.io/projected/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-kube-api-access-rsjzv\") pod \"root-account-create-update-q2jgz\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") " pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.956649 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-operator-scripts\") pod \"root-account-create-update-q2jgz\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") " pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.956940 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjzv\" (UniqueName: \"kubernetes.io/projected/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-kube-api-access-rsjzv\") pod \"root-account-create-update-q2jgz\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") " pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:30 crc kubenswrapper[4912]: I0318 13:25:30.958917 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-operator-scripts\") pod \"root-account-create-update-q2jgz\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") " pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:30.995275 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjzv\" (UniqueName: \"kubernetes.io/projected/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-kube-api-access-rsjzv\") pod \"root-account-create-update-q2jgz\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") " pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.121868 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.293583 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7nd97-config-9fxbp"
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.371080 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-log-ovn\") pod \"12b67fd1-7452-4928-9e15-2a1e780b14bb\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") "
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.371254 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd4tj\" (UniqueName: \"kubernetes.io/projected/12b67fd1-7452-4928-9e15-2a1e780b14bb-kube-api-access-qd4tj\") pod \"12b67fd1-7452-4928-9e15-2a1e780b14bb\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") "
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.371386 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-scripts\") pod \"12b67fd1-7452-4928-9e15-2a1e780b14bb\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") "
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.371483 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-additional-scripts\") pod \"12b67fd1-7452-4928-9e15-2a1e780b14bb\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") "
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.371523 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run\") pod \"12b67fd1-7452-4928-9e15-2a1e780b14bb\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") "
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.371625 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run-ovn\") pod \"12b67fd1-7452-4928-9e15-2a1e780b14bb\" (UID: \"12b67fd1-7452-4928-9e15-2a1e780b14bb\") "
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.372236 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "12b67fd1-7452-4928-9e15-2a1e780b14bb" (UID: "12b67fd1-7452-4928-9e15-2a1e780b14bb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.372212 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run" (OuterVolumeSpecName: "var-run") pod "12b67fd1-7452-4928-9e15-2a1e780b14bb" (UID: "12b67fd1-7452-4928-9e15-2a1e780b14bb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.372299 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "12b67fd1-7452-4928-9e15-2a1e780b14bb" (UID: "12b67fd1-7452-4928-9e15-2a1e780b14bb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.373314 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "12b67fd1-7452-4928-9e15-2a1e780b14bb" (UID: "12b67fd1-7452-4928-9e15-2a1e780b14bb"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.373671 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-scripts" (OuterVolumeSpecName: "scripts") pod "12b67fd1-7452-4928-9e15-2a1e780b14bb" (UID: "12b67fd1-7452-4928-9e15-2a1e780b14bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.375226 4912 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.375316 4912 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.375333 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.375346 4912 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12b67fd1-7452-4928-9e15-2a1e780b14bb-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.375361 4912 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12b67fd1-7452-4928-9e15-2a1e780b14bb-var-run\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.387399 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b67fd1-7452-4928-9e15-2a1e780b14bb-kube-api-access-qd4tj" (OuterVolumeSpecName: "kube-api-access-qd4tj") pod "12b67fd1-7452-4928-9e15-2a1e780b14bb" (UID: "12b67fd1-7452-4928-9e15-2a1e780b14bb"). InnerVolumeSpecName "kube-api-access-qd4tj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.477222 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd4tj\" (UniqueName: \"kubernetes.io/projected/12b67fd1-7452-4928-9e15-2a1e780b14bb-kube-api-access-qd4tj\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.711129 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q2jgz"]
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.785257 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.834983 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7nd97-config-9fxbp" event={"ID":"12b67fd1-7452-4928-9e15-2a1e780b14bb","Type":"ContainerDied","Data":"aa8ffa51d67553ee144a1f86b51abd5997883d9abb033ca7d270a0f1aec01cca"}
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.835072 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7nd97-config-9fxbp"
Mar 18 13:25:31 crc kubenswrapper[4912]: I0318 13:25:31.835119 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa8ffa51d67553ee144a1f86b51abd5997883d9abb033ca7d270a0f1aec01cca"
Mar 18 13:25:31 crc kubenswrapper[4912]: W0318 13:25:31.972341 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb13f5d_2aaa_4efb_b7a5_4e3a477d15f2.slice/crio-e6fc0ad4652cae120842e0167bb64fde8e59b0aa8b5fe60833ce9240dcf1a372 WatchSource:0}: Error finding container e6fc0ad4652cae120842e0167bb64fde8e59b0aa8b5fe60833ce9240dcf1a372: Status 404 returned error can't find the container with id e6fc0ad4652cae120842e0167bb64fde8e59b0aa8b5fe60833ce9240dcf1a372
Mar 18 13:25:32 crc kubenswrapper[4912]: I0318 13:25:32.462177 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7nd97-config-9fxbp"]
Mar 18 13:25:32 crc kubenswrapper[4912]: I0318 13:25:32.468800 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7nd97-config-9fxbp"]
Mar 18 13:25:32 crc kubenswrapper[4912]: I0318 13:25:32.850137 4912 generic.go:334] "Generic (PLEG): container finished" podID="4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2" containerID="2708cfeb2aa827f9b8643cd77458cf9445120d7378b999ce6dd06aeb73d28516" exitCode=0
Mar 18 13:25:32 crc kubenswrapper[4912]: I0318 13:25:32.850245 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q2jgz" event={"ID":"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2","Type":"ContainerDied","Data":"2708cfeb2aa827f9b8643cd77458cf9445120d7378b999ce6dd06aeb73d28516"}
Mar 18 13:25:32 crc kubenswrapper[4912]: I0318 13:25:32.850697 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q2jgz" event={"ID":"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2","Type":"ContainerStarted","Data":"e6fc0ad4652cae120842e0167bb64fde8e59b0aa8b5fe60833ce9240dcf1a372"}
Mar 18 13:25:32 crc kubenswrapper[4912]: I0318 13:25:32.857283 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"97b19227386dc68088d9c94721e0908166c7e0df02f3f39cb917f2d99fe0142b"}
Mar 18 13:25:32 crc kubenswrapper[4912]: I0318 13:25:32.857347 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"0ce60f0373af6c679f20cf23b7c5075dc047be00c1e92b31001bce0143c81076"}
Mar 18 13:25:33 crc kubenswrapper[4912]: I0318 13:25:33.879149 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"b75a740485122af301221606c5ff1503cb2e49de1e15656c8152a301a06182af"}
Mar 18 13:25:33 crc kubenswrapper[4912]: I0318 13:25:33.879581 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"4ac96c3360bed1ea6de421022e8e4991897c375b285a82fba32f7f95f1e42907"}
Mar 18 13:25:33 crc kubenswrapper[4912]: I0318 13:25:33.879599 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"22ead0bddf8b92a98f99d683ae8f9e1bac3a5d11b62c8914a1fe3cb74e3ea8c4"}
Mar 18 13:25:33 crc kubenswrapper[4912]: I0318 13:25:33.879613 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"7e0c414d052ad26c50f5c24305aa103b8d34d7e3a6771dae42f8a56605596412"}
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.241972 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b67fd1-7452-4928-9e15-2a1e780b14bb" path="/var/lib/kubelet/pods/12b67fd1-7452-4928-9e15-2a1e780b14bb/volumes"
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.398644 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.470794 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsjzv\" (UniqueName: \"kubernetes.io/projected/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-kube-api-access-rsjzv\") pod \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") "
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.470950 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-operator-scripts\") pod \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\" (UID: \"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2\") "
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.472456 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2" (UID: "4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.478273 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-kube-api-access-rsjzv" (OuterVolumeSpecName: "kube-api-access-rsjzv") pod "4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2" (UID: "4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2"). InnerVolumeSpecName "kube-api-access-rsjzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.574857 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsjzv\" (UniqueName: \"kubernetes.io/projected/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-kube-api-access-rsjzv\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.574900 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.891862 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q2jgz" event={"ID":"4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2","Type":"ContainerDied","Data":"e6fc0ad4652cae120842e0167bb64fde8e59b0aa8b5fe60833ce9240dcf1a372"}
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.892353 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6fc0ad4652cae120842e0167bb64fde8e59b0aa8b5fe60833ce9240dcf1a372"
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.891884 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q2jgz"
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.901026 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8f71e79a-72ad-4de7-9b24-7ac75884deae","Type":"ContainerStarted","Data":"66bbb0262b09c6e7fcb40d67062529155fbf762463bf3612f026b021efea20c4"}
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.944991 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.16819633 podStartE2EDuration="44.944957241s" podCreationTimestamp="2026-03-18 13:24:50 +0000 UTC" firstStartedPulling="2026-03-18 13:25:24.238772444 +0000 UTC m=+1372.698199869" lastFinishedPulling="2026-03-18 13:25:32.015533355 +0000 UTC m=+1380.474960780" observedRunningTime="2026-03-18 13:25:34.932833086 +0000 UTC m=+1383.392260531" watchObservedRunningTime="2026-03-18 13:25:34.944957241 +0000 UTC m=+1383.404384666"
Mar 18 13:25:34 crc kubenswrapper[4912]: I0318 13:25:34.997356 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.020329 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.092642 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.374091 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-s6l7z"]
Mar 18 13:25:35 crc kubenswrapper[4912]: E0318 13:25:35.374719 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b67fd1-7452-4928-9e15-2a1e780b14bb" containerName="ovn-config"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.374732 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b67fd1-7452-4928-9e15-2a1e780b14bb" containerName="ovn-config"
Mar 18 13:25:35 crc kubenswrapper[4912]: E0318 13:25:35.374751 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2" containerName="mariadb-account-create-update"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.374757 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2" containerName="mariadb-account-create-update"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.375004 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2" containerName="mariadb-account-create-update"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.375016 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b67fd1-7452-4928-9e15-2a1e780b14bb" containerName="ovn-config"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.376468 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.378805 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.390317 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-s6l7z"]
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.513902 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-config\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.513965 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.514012 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.514129 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcwnw\" (UniqueName: \"kubernetes.io/projected/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-kube-api-access-hcwnw\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.514365 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.514647 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.616886 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-config\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.616937 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.616985 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.617017 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcwnw\" (UniqueName: \"kubernetes.io/projected/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-kube-api-access-hcwnw\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.617096 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.617197 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.618024 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.618029 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.618371 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.618450 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.618968 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-config\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.643063 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcwnw\" (UniqueName: \"kubernetes.io/projected/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-kube-api-access-hcwnw\") pod \"dnsmasq-dns-5c79d794d7-s6l7z\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:35 crc kubenswrapper[4912]: I0318 13:25:35.710091 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:36 crc kubenswrapper[4912]: I0318 13:25:36.247738 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-s6l7z"]
Mar 18 13:25:36 crc kubenswrapper[4912]: I0318 13:25:36.951302 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" event={"ID":"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452","Type":"ContainerDied","Data":"e15750ff6eda1e7fdefa6f350be058703d85dff7c6dbea0054fd53ad7c049615"}
Mar 18 13:25:36 crc kubenswrapper[4912]: I0318 13:25:36.951248 4912 generic.go:334] "Generic (PLEG): container finished" podID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerID="e15750ff6eda1e7fdefa6f350be058703d85dff7c6dbea0054fd53ad7c049615" exitCode=0
Mar 18 13:25:36 crc kubenswrapper[4912]: I0318 13:25:36.952280 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" event={"ID":"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452","Type":"ContainerStarted","Data":"ddcb124e0d1b6d9117075252783c0755546a3d1d65275ec222072e4d11708200"}
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.700324 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-rbnsb"]
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.702364 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rbnsb"
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.755340 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rbnsb"]
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.893471 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbq7k\" (UniqueName: \"kubernetes.io/projected/4fa24fe7-cd66-47b6-9154-101f961c8482-kube-api-access-tbq7k\") pod \"cinder-db-create-rbnsb\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " pod="openstack/cinder-db-create-rbnsb"
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.893703 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa24fe7-cd66-47b6-9154-101f961c8482-operator-scripts\") pod \"cinder-db-create-rbnsb\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " pod="openstack/cinder-db-create-rbnsb"
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.960251 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-52ed-account-create-update-d48pr"]
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.966240 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-52ed-account-create-update-d48pr"
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.978576 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-52ed-account-create-update-d48pr"]
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.978938 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.995743 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbq7k\" (UniqueName: \"kubernetes.io/projected/4fa24fe7-cd66-47b6-9154-101f961c8482-kube-api-access-tbq7k\") pod \"cinder-db-create-rbnsb\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " pod="openstack/cinder-db-create-rbnsb"
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.996684 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa24fe7-cd66-47b6-9154-101f961c8482-operator-scripts\") pod \"cinder-db-create-rbnsb\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " pod="openstack/cinder-db-create-rbnsb"
Mar 18 13:25:37 crc kubenswrapper[4912]: I0318 13:25:37.997932 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa24fe7-cd66-47b6-9154-101f961c8482-operator-scripts\") pod \"cinder-db-create-rbnsb\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " pod="openstack/cinder-db-create-rbnsb"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.053857 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-skzn7"]
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.068557 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-krlbd" event={"ID":"a92c61dc-cfdf-4610-81b7-553c9882fc26","Type":"ContainerStarted","Data":"7b3f3e459c9334390333b26ef9e0b392b8bf6f6ab79f21810221ba342e4bc120"}
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.068829 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-skzn7"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.085147 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbq7k\" (UniqueName: \"kubernetes.io/projected/4fa24fe7-cd66-47b6-9154-101f961c8482-kube-api-access-tbq7k\") pod \"cinder-db-create-rbnsb\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " pod="openstack/cinder-db-create-rbnsb"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.092214 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" event={"ID":"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452","Type":"ContainerStarted","Data":"fe4129bd358e1716fa75b4bc44fa8f135256f852b8387ebeb8d3fbc096d30410"}
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.093323 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.102369 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxgv\" (UniqueName: \"kubernetes.io/projected/6cb39d68-1138-410e-9577-197e9ff4b0c5-kube-api-access-qkxgv\") pod \"cinder-52ed-account-create-update-d48pr\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " pod="openstack/cinder-52ed-account-create-update-d48pr"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.102469 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cb39d68-1138-410e-9577-197e9ff4b0c5-operator-scripts\") pod \"cinder-52ed-account-create-update-d48pr\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " pod="openstack/cinder-52ed-account-create-update-d48pr"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.103251 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-skzn7"]
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.138246 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-55c0-account-create-update-dg7pc"]
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.139760 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-55c0-account-create-update-dg7pc"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.149123 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-55c0-account-create-update-dg7pc"]
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.160683 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.196193 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dhhd8"]
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.204622 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dhhd8"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.211063 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxgv\" (UniqueName: \"kubernetes.io/projected/6cb39d68-1138-410e-9577-197e9ff4b0c5-kube-api-access-qkxgv\") pod \"cinder-52ed-account-create-update-d48pr\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " pod="openstack/cinder-52ed-account-create-update-d48pr"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.211246 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-operator-scripts\") pod \"heat-db-create-skzn7\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " pod="openstack/heat-db-create-skzn7"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.211365 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cb39d68-1138-410e-9577-197e9ff4b0c5-operator-scripts\") pod \"cinder-52ed-account-create-update-d48pr\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " pod="openstack/cinder-52ed-account-create-update-d48pr"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.215236 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dhhd8"]
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.218610 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.219021 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knn24"
Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.219536 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 13:25:38 crc
kubenswrapper[4912]: I0318 13:25:38.220415 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.220474 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbc8n\" (UniqueName: \"kubernetes.io/projected/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-kube-api-access-vbc8n\") pod \"heat-db-create-skzn7\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " pod="openstack/heat-db-create-skzn7" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.220680 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cb39d68-1138-410e-9577-197e9ff4b0c5-operator-scripts\") pod \"cinder-52ed-account-create-update-d48pr\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " pod="openstack/cinder-52ed-account-create-update-d48pr" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.230747 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-krlbd" podStartSLOduration=2.982432095 podStartE2EDuration="40.230723692s" podCreationTimestamp="2026-03-18 13:24:58 +0000 UTC" firstStartedPulling="2026-03-18 13:24:59.559269407 +0000 UTC m=+1348.018696832" lastFinishedPulling="2026-03-18 13:25:36.807560984 +0000 UTC m=+1385.266988429" observedRunningTime="2026-03-18 13:25:38.165151022 +0000 UTC m=+1386.624578447" watchObservedRunningTime="2026-03-18 13:25:38.230723692 +0000 UTC m=+1386.690151117" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.266185 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxgv\" (UniqueName: \"kubernetes.io/projected/6cb39d68-1138-410e-9577-197e9ff4b0c5-kube-api-access-qkxgv\") pod \"cinder-52ed-account-create-update-d48pr\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " pod="openstack/cinder-52ed-account-create-update-d48pr" Mar 18 13:25:38 
crc kubenswrapper[4912]: I0318 13:25:38.305324 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ebf9-account-create-update-8tl2r"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.307111 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.310918 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.312097 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-52ed-account-create-update-d48pr" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.322130 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" podStartSLOduration=3.322093815 podStartE2EDuration="3.322093815s" podCreationTimestamp="2026-03-18 13:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:25:38.219894931 +0000 UTC m=+1386.679322366" watchObservedRunningTime="2026-03-18 13:25:38.322093815 +0000 UTC m=+1386.781521240" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.322943 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbc8n\" (UniqueName: \"kubernetes.io/projected/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-kube-api-access-vbc8n\") pod \"heat-db-create-skzn7\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " pod="openstack/heat-db-create-skzn7" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.323209 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p69k6\" (UniqueName: \"kubernetes.io/projected/4eef913b-f65b-41a9-b0fa-3463914463f5-kube-api-access-p69k6\") pod 
\"heat-55c0-account-create-update-dg7pc\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.323260 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eef913b-f65b-41a9-b0fa-3463914463f5-operator-scripts\") pod \"heat-55c0-account-create-update-dg7pc\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.323288 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-config-data\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.323314 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-operator-scripts\") pod \"heat-db-create-skzn7\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " pod="openstack/heat-db-create-skzn7" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.323393 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5rmq\" (UniqueName: \"kubernetes.io/projected/c77cb1e2-3c24-41cb-95fa-ff54327ae194-kube-api-access-h5rmq\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.323450 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-combined-ca-bundle\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.325635 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-operator-scripts\") pod \"heat-db-create-skzn7\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " pod="openstack/heat-db-create-skzn7" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.347957 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ebf9-account-create-update-8tl2r"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.377959 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rbnsb" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.381697 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbc8n\" (UniqueName: \"kubernetes.io/projected/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-kube-api-access-vbc8n\") pod \"heat-db-create-skzn7\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " pod="openstack/heat-db-create-skzn7" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.431211 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-config-data\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.431317 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5rmq\" (UniqueName: \"kubernetes.io/projected/c77cb1e2-3c24-41cb-95fa-ff54327ae194-kube-api-access-h5rmq\") pod \"keystone-db-sync-dhhd8\" (UID: 
\"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.431488 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7327a84-0a21-4528-bd67-8a43d103e004-operator-scripts\") pod \"barbican-ebf9-account-create-update-8tl2r\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.431536 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-combined-ca-bundle\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.431659 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p69k6\" (UniqueName: \"kubernetes.io/projected/4eef913b-f65b-41a9-b0fa-3463914463f5-kube-api-access-p69k6\") pod \"heat-55c0-account-create-update-dg7pc\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.431690 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4p6\" (UniqueName: \"kubernetes.io/projected/d7327a84-0a21-4528-bd67-8a43d103e004-kube-api-access-8w4p6\") pod \"barbican-ebf9-account-create-update-8tl2r\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.431722 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4eef913b-f65b-41a9-b0fa-3463914463f5-operator-scripts\") pod \"heat-55c0-account-create-update-dg7pc\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.432743 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eef913b-f65b-41a9-b0fa-3463914463f5-operator-scripts\") pod \"heat-55c0-account-create-update-dg7pc\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.442265 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-de81-account-create-update-mxs2s"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.444741 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.448562 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-config-data\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.457979 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.458770 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-combined-ca-bundle\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.460548 4912 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/neutron-de81-account-create-update-mxs2s"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.472867 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p69k6\" (UniqueName: \"kubernetes.io/projected/4eef913b-f65b-41a9-b0fa-3463914463f5-kube-api-access-p69k6\") pod \"heat-55c0-account-create-update-dg7pc\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.474783 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-skzn7" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.481579 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5rmq\" (UniqueName: \"kubernetes.io/projected/c77cb1e2-3c24-41cb-95fa-ff54327ae194-kube-api-access-h5rmq\") pod \"keystone-db-sync-dhhd8\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.494664 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8cwkc"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.498999 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.525339 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.534272 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4p6\" (UniqueName: \"kubernetes.io/projected/d7327a84-0a21-4528-bd67-8a43d103e004-kube-api-access-8w4p6\") pod \"barbican-ebf9-account-create-update-8tl2r\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.534362 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5da76f4-5031-4a81-ae19-96d01814f859-operator-scripts\") pod \"barbican-db-create-8cwkc\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.534378 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.534421 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7327a84-0a21-4528-bd67-8a43d103e004-operator-scripts\") pod \"barbican-ebf9-account-create-update-8tl2r\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.534499 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm6mr\" (UniqueName: \"kubernetes.io/projected/a5da76f4-5031-4a81-ae19-96d01814f859-kube-api-access-tm6mr\") pod \"barbican-db-create-8cwkc\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.534523 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2aa83-efef-4845-bfd5-ae8bf926f515-operator-scripts\") pod \"neutron-de81-account-create-update-mxs2s\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.534561 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6sv9\" (UniqueName: \"kubernetes.io/projected/ffb2aa83-efef-4845-bfd5-ae8bf926f515-kube-api-access-j6sv9\") pod \"neutron-de81-account-create-update-mxs2s\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.535629 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d7327a84-0a21-4528-bd67-8a43d103e004-operator-scripts\") pod \"barbican-ebf9-account-create-update-8tl2r\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.545819 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8cwkc"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.578973 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4p6\" (UniqueName: \"kubernetes.io/projected/d7327a84-0a21-4528-bd67-8a43d103e004-kube-api-access-8w4p6\") pod \"barbican-ebf9-account-create-update-8tl2r\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.605305 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xpx4j"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.607363 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.637238 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5da76f4-5031-4a81-ae19-96d01814f859-operator-scripts\") pod \"barbican-db-create-8cwkc\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.637369 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm6mr\" (UniqueName: \"kubernetes.io/projected/a5da76f4-5031-4a81-ae19-96d01814f859-kube-api-access-tm6mr\") pod \"barbican-db-create-8cwkc\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.637404 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2aa83-efef-4845-bfd5-ae8bf926f515-operator-scripts\") pod \"neutron-de81-account-create-update-mxs2s\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.637452 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6sv9\" (UniqueName: \"kubernetes.io/projected/ffb2aa83-efef-4845-bfd5-ae8bf926f515-kube-api-access-j6sv9\") pod \"neutron-de81-account-create-update-mxs2s\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.638245 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.638549 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5da76f4-5031-4a81-ae19-96d01814f859-operator-scripts\") pod \"barbican-db-create-8cwkc\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.639554 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2aa83-efef-4845-bfd5-ae8bf926f515-operator-scripts\") pod \"neutron-de81-account-create-update-mxs2s\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.661425 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xpx4j"] Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.688090 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm6mr\" (UniqueName: \"kubernetes.io/projected/a5da76f4-5031-4a81-ae19-96d01814f859-kube-api-access-tm6mr\") pod \"barbican-db-create-8cwkc\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.697764 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6sv9\" (UniqueName: \"kubernetes.io/projected/ffb2aa83-efef-4845-bfd5-ae8bf926f515-kube-api-access-j6sv9\") pod \"neutron-de81-account-create-update-mxs2s\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.744943 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6588379-d349-492e-a673-8f75b93fd640-operator-scripts\") pod \"neutron-db-create-xpx4j\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.745640 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfz7k\" (UniqueName: \"kubernetes.io/projected/f6588379-d349-492e-a673-8f75b93fd640-kube-api-access-nfz7k\") pod \"neutron-db-create-xpx4j\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.787784 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.826546 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.859568 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6588379-d349-492e-a673-8f75b93fd640-operator-scripts\") pod \"neutron-db-create-xpx4j\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.859747 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfz7k\" (UniqueName: \"kubernetes.io/projected/f6588379-d349-492e-a673-8f75b93fd640-kube-api-access-nfz7k\") pod \"neutron-db-create-xpx4j\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.861066 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f6588379-d349-492e-a673-8f75b93fd640-operator-scripts\") pod \"neutron-db-create-xpx4j\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.921438 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfz7k\" (UniqueName: \"kubernetes.io/projected/f6588379-d349-492e-a673-8f75b93fd640-kube-api-access-nfz7k\") pod \"neutron-db-create-xpx4j\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:38 crc kubenswrapper[4912]: I0318 13:25:38.955426 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:39 crc kubenswrapper[4912]: I0318 13:25:39.409923 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-52ed-account-create-update-d48pr"] Mar 18 13:25:39 crc kubenswrapper[4912]: I0318 13:25:39.717349 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rbnsb"] Mar 18 13:25:39 crc kubenswrapper[4912]: I0318 13:25:39.734352 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-skzn7"] Mar 18 13:25:39 crc kubenswrapper[4912]: I0318 13:25:39.948853 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-55c0-account-create-update-dg7pc"] Mar 18 13:25:39 crc kubenswrapper[4912]: I0318 13:25:39.992633 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ebf9-account-create-update-8tl2r"] Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.115715 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dhhd8"] Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.155386 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-de81-account-create-update-mxs2s"] Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 
13:25:40.191281 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ebf9-account-create-update-8tl2r" event={"ID":"d7327a84-0a21-4528-bd67-8a43d103e004","Type":"ContainerStarted","Data":"5e29ae04406d7d3d8de8a3c88556cababcc1f20048e926f2a2558bba2e6910fd"} Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.196882 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-55c0-account-create-update-dg7pc" event={"ID":"4eef913b-f65b-41a9-b0fa-3463914463f5","Type":"ContainerStarted","Data":"140424eedbf0b6e8b70beed449b7c62982692e3c3211dd347532bb0e890c5449"} Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.207973 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-skzn7" event={"ID":"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1","Type":"ContainerStarted","Data":"e5322b70887d51db1fa8782b962ea0ad6085b1831b04dfe55a6b1aea8121c544"} Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.281678 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-52ed-account-create-update-d48pr" podStartSLOduration=3.28164861 podStartE2EDuration="3.28164861s" podCreationTimestamp="2026-03-18 13:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:25:40.270268955 +0000 UTC m=+1388.729696380" watchObservedRunningTime="2026-03-18 13:25:40.28164861 +0000 UTC m=+1388.741076025" Mar 18 13:25:40 crc kubenswrapper[4912]: W0318 13:25:40.394203 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5da76f4_5031_4a81_ae19_96d01814f859.slice/crio-4e4803b26b008ee939929e107b38250954ddad8e860b00a848cf091c2be2da01 WatchSource:0}: Error finding container 4e4803b26b008ee939929e107b38250954ddad8e860b00a848cf091c2be2da01: Status 404 returned error can't find the container with id 
4e4803b26b008ee939929e107b38250954ddad8e860b00a848cf091c2be2da01 Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.399823 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-52ed-account-create-update-d48pr" event={"ID":"6cb39d68-1138-410e-9577-197e9ff4b0c5","Type":"ContainerStarted","Data":"692160d8e33d62ebef21f2ea31a10e2a6bc99fa31d8ae9ccd0a76088412ca4f3"} Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.400542 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8cwkc"] Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.406461 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xpx4j"] Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.406637 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-52ed-account-create-update-d48pr" event={"ID":"6cb39d68-1138-410e-9577-197e9ff4b0c5","Type":"ContainerStarted","Data":"27dd77dc6961ad5dd3b6e3d22433acea2e1addad15e4efe4ca0ccdb719b07a1b"} Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.406819 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-de81-account-create-update-mxs2s" event={"ID":"ffb2aa83-efef-4845-bfd5-ae8bf926f515","Type":"ContainerStarted","Data":"66aa3c38262140a1e7fe42461fd46595f3e144b38cf3bc34ed4445d526f45a9b"} Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.406960 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rbnsb" event={"ID":"4fa24fe7-cd66-47b6-9154-101f961c8482","Type":"ContainerStarted","Data":"7b79f60c92cbd348e4806f951240349bea2e2682d393e788ccce3d3cf4b26879"} Mar 18 13:25:40 crc kubenswrapper[4912]: I0318 13:25:40.407085 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dhhd8" event={"ID":"c77cb1e2-3c24-41cb-95fa-ff54327ae194","Type":"ContainerStarted","Data":"bc11888c51d87539d8d51f511b13ce0cb28393dc0fbb94a81020804eb449a1a7"} Mar 18 13:25:41 
crc kubenswrapper[4912]: I0318 13:25:41.312634 4912 generic.go:334] "Generic (PLEG): container finished" podID="a5da76f4-5031-4a81-ae19-96d01814f859" containerID="93d0dbe94a524ec350f585f38bde45d37b221bde679bcb8649d08e11f3d416e7" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.312732 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8cwkc" event={"ID":"a5da76f4-5031-4a81-ae19-96d01814f859","Type":"ContainerDied","Data":"93d0dbe94a524ec350f585f38bde45d37b221bde679bcb8649d08e11f3d416e7"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.312767 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8cwkc" event={"ID":"a5da76f4-5031-4a81-ae19-96d01814f859","Type":"ContainerStarted","Data":"4e4803b26b008ee939929e107b38250954ddad8e860b00a848cf091c2be2da01"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.315531 4912 generic.go:334] "Generic (PLEG): container finished" podID="d7327a84-0a21-4528-bd67-8a43d103e004" containerID="1ab04ce43a352a0bacc29661f56e9bbdaefe6338acde430dab65052025f4a6d5" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.315627 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ebf9-account-create-update-8tl2r" event={"ID":"d7327a84-0a21-4528-bd67-8a43d103e004","Type":"ContainerDied","Data":"1ab04ce43a352a0bacc29661f56e9bbdaefe6338acde430dab65052025f4a6d5"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.317747 4912 generic.go:334] "Generic (PLEG): container finished" podID="4eef913b-f65b-41a9-b0fa-3463914463f5" containerID="58530107af85bb2832504ec1e89d02bfe399c064c47b7b71709664a0cd2f12f7" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.317883 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-55c0-account-create-update-dg7pc" 
event={"ID":"4eef913b-f65b-41a9-b0fa-3463914463f5","Type":"ContainerDied","Data":"58530107af85bb2832504ec1e89d02bfe399c064c47b7b71709664a0cd2f12f7"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.320190 4912 generic.go:334] "Generic (PLEG): container finished" podID="a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1" containerID="e981724626d8d5605b3301d04d3e928226dc72275ca7d1310ea94db1e9da75d5" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.320257 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-skzn7" event={"ID":"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1","Type":"ContainerDied","Data":"e981724626d8d5605b3301d04d3e928226dc72275ca7d1310ea94db1e9da75d5"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.322714 4912 generic.go:334] "Generic (PLEG): container finished" podID="6cb39d68-1138-410e-9577-197e9ff4b0c5" containerID="692160d8e33d62ebef21f2ea31a10e2a6bc99fa31d8ae9ccd0a76088412ca4f3" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.322763 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-52ed-account-create-update-d48pr" event={"ID":"6cb39d68-1138-410e-9577-197e9ff4b0c5","Type":"ContainerDied","Data":"692160d8e33d62ebef21f2ea31a10e2a6bc99fa31d8ae9ccd0a76088412ca4f3"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.325408 4912 generic.go:334] "Generic (PLEG): container finished" podID="ffb2aa83-efef-4845-bfd5-ae8bf926f515" containerID="c8664325e9a061c4521e14f190f2d5fa603154ae6abaaee3f76074f65dff12f3" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.325462 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-de81-account-create-update-mxs2s" event={"ID":"ffb2aa83-efef-4845-bfd5-ae8bf926f515","Type":"ContainerDied","Data":"c8664325e9a061c4521e14f190f2d5fa603154ae6abaaee3f76074f65dff12f3"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.328013 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="4fa24fe7-cd66-47b6-9154-101f961c8482" containerID="ebed85dc94f5fa30b16e2bc5ae45567364cac108b0a10991c181baaff476792d" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.328084 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rbnsb" event={"ID":"4fa24fe7-cd66-47b6-9154-101f961c8482","Type":"ContainerDied","Data":"ebed85dc94f5fa30b16e2bc5ae45567364cac108b0a10991c181baaff476792d"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.344467 4912 generic.go:334] "Generic (PLEG): container finished" podID="f6588379-d349-492e-a673-8f75b93fd640" containerID="f96c6c3d0bc66135ba7daef9263b6cad85ba1e4cdb4a0f14db822fde24af87ce" exitCode=0 Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.344543 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xpx4j" event={"ID":"f6588379-d349-492e-a673-8f75b93fd640","Type":"ContainerDied","Data":"f96c6c3d0bc66135ba7daef9263b6cad85ba1e4cdb4a0f14db822fde24af87ce"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.344583 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xpx4j" event={"ID":"f6588379-d349-492e-a673-8f75b93fd640","Type":"ContainerStarted","Data":"34b9cd6060cca26a7d1b314e56520a3e66ff9a64f7c4d3e344a70eecf64009a6"} Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.785233 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:41 crc kubenswrapper[4912]: I0318 13:25:41.790365 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:42 crc kubenswrapper[4912]: I0318 13:25:42.359306 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:45 crc kubenswrapper[4912]: I0318 13:25:45.609918 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:25:45 crc kubenswrapper[4912]: I0318 13:25:45.611193 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="prometheus" containerID="cri-o://d1af359093d0e429a68984f9a01904c2313a0a170d01fbf3508cc8e1dd2fbcab" gracePeriod=600 Mar 18 13:25:45 crc kubenswrapper[4912]: I0318 13:25:45.611903 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="thanos-sidecar" containerID="cri-o://64c86d6e3878ff47f0697a7a5cbfc14a1eeee8ed2bf4671f91bc03cc3dbea0cd" gracePeriod=600 Mar 18 13:25:45 crc kubenswrapper[4912]: I0318 13:25:45.611971 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="config-reloader" containerID="cri-o://a8d9f4d281be291c4edb28b0d4e59e0e7d084e8a8634af5b5794d113bd83311f" gracePeriod=600 Mar 18 13:25:45 crc kubenswrapper[4912]: I0318 13:25:45.713292 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" Mar 18 13:25:45 crc kubenswrapper[4912]: I0318 13:25:45.832468 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-449s7"] Mar 18 13:25:45 crc kubenswrapper[4912]: I0318 13:25:45.832841 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" podUID="ae016812-7be3-4001-822e-9979bd4ce648" containerName="dnsmasq-dns" containerID="cri-o://c310ab7b8dc3896908da85cfc27b03735933f37091211f10c8fb3db17cfe4826" gracePeriod=10 Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.439833 4912 generic.go:334] "Generic (PLEG): container finished" podID="73069f34-9c0b-4204-a2f3-8b283232ce86" 
containerID="64c86d6e3878ff47f0697a7a5cbfc14a1eeee8ed2bf4671f91bc03cc3dbea0cd" exitCode=0 Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.440333 4912 generic.go:334] "Generic (PLEG): container finished" podID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerID="a8d9f4d281be291c4edb28b0d4e59e0e7d084e8a8634af5b5794d113bd83311f" exitCode=0 Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.440345 4912 generic.go:334] "Generic (PLEG): container finished" podID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerID="d1af359093d0e429a68984f9a01904c2313a0a170d01fbf3508cc8e1dd2fbcab" exitCode=0 Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.439939 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerDied","Data":"64c86d6e3878ff47f0697a7a5cbfc14a1eeee8ed2bf4671f91bc03cc3dbea0cd"} Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.440434 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerDied","Data":"a8d9f4d281be291c4edb28b0d4e59e0e7d084e8a8634af5b5794d113bd83311f"} Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.440454 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerDied","Data":"d1af359093d0e429a68984f9a01904c2313a0a170d01fbf3508cc8e1dd2fbcab"} Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.445245 4912 generic.go:334] "Generic (PLEG): container finished" podID="ae016812-7be3-4001-822e-9979bd4ce648" containerID="c310ab7b8dc3896908da85cfc27b03735933f37091211f10c8fb3db17cfe4826" exitCode=0 Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.445296 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" 
event={"ID":"ae016812-7be3-4001-822e-9979bd4ce648","Type":"ContainerDied","Data":"c310ab7b8dc3896908da85cfc27b03735933f37091211f10c8fb3db17cfe4826"} Mar 18 13:25:46 crc kubenswrapper[4912]: I0318 13:25:46.785726 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.142:9090/-/ready\": dial tcp 10.217.0.142:9090: connect: connection refused" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.438758 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-52ed-account-create-update-d48pr" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.469512 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.482591 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rbnsb" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.498026 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.503787 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.521087 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-de81-account-create-update-mxs2s" event={"ID":"ffb2aa83-efef-4845-bfd5-ae8bf926f515","Type":"ContainerDied","Data":"66aa3c38262140a1e7fe42461fd46595f3e144b38cf3bc34ed4445d526f45a9b"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.521628 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66aa3c38262140a1e7fe42461fd46595f3e144b38cf3bc34ed4445d526f45a9b" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.521624 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-de81-account-create-update-mxs2s" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.536864 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rbnsb" event={"ID":"4fa24fe7-cd66-47b6-9154-101f961c8482","Type":"ContainerDied","Data":"7b79f60c92cbd348e4806f951240349bea2e2682d393e788ccce3d3cf4b26879"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.536930 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b79f60c92cbd348e4806f951240349bea2e2682d393e788ccce3d3cf4b26879" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.537009 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rbnsb" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.537001 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-skzn7" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.542031 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkxgv\" (UniqueName: \"kubernetes.io/projected/6cb39d68-1138-410e-9577-197e9ff4b0c5-kube-api-access-qkxgv\") pod \"6cb39d68-1138-410e-9577-197e9ff4b0c5\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.542660 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cb39d68-1138-410e-9577-197e9ff4b0c5-operator-scripts\") pod \"6cb39d68-1138-410e-9577-197e9ff4b0c5\" (UID: \"6cb39d68-1138-410e-9577-197e9ff4b0c5\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.549031 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb39d68-1138-410e-9577-197e9ff4b0c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6cb39d68-1138-410e-9577-197e9ff4b0c5" (UID: "6cb39d68-1138-410e-9577-197e9ff4b0c5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.551149 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xpx4j" event={"ID":"f6588379-d349-492e-a673-8f75b93fd640","Type":"ContainerDied","Data":"34b9cd6060cca26a7d1b314e56520a3e66ff9a64f7c4d3e344a70eecf64009a6"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.551229 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34b9cd6060cca26a7d1b314e56520a3e66ff9a64f7c4d3e344a70eecf64009a6" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.562124 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb39d68-1138-410e-9577-197e9ff4b0c5-kube-api-access-qkxgv" (OuterVolumeSpecName: "kube-api-access-qkxgv") pod "6cb39d68-1138-410e-9577-197e9ff4b0c5" (UID: "6cb39d68-1138-410e-9577-197e9ff4b0c5"). InnerVolumeSpecName "kube-api-access-qkxgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.573952 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8cwkc" event={"ID":"a5da76f4-5031-4a81-ae19-96d01814f859","Type":"ContainerDied","Data":"4e4803b26b008ee939929e107b38250954ddad8e860b00a848cf091c2be2da01"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.574020 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e4803b26b008ee939929e107b38250954ddad8e860b00a848cf091c2be2da01" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.574258 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8cwkc" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.589415 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ebf9-account-create-update-8tl2r" event={"ID":"d7327a84-0a21-4528-bd67-8a43d103e004","Type":"ContainerDied","Data":"5e29ae04406d7d3d8de8a3c88556cababcc1f20048e926f2a2558bba2e6910fd"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.589458 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e29ae04406d7d3d8de8a3c88556cababcc1f20048e926f2a2558bba2e6910fd" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.608405 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-55c0-account-create-update-dg7pc" event={"ID":"4eef913b-f65b-41a9-b0fa-3463914463f5","Type":"ContainerDied","Data":"140424eedbf0b6e8b70beed449b7c62982692e3c3211dd347532bb0e890c5449"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.608471 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140424eedbf0b6e8b70beed449b7c62982692e3c3211dd347532bb0e890c5449" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.608639 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-55c0-account-create-update-dg7pc" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.624737 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-skzn7" event={"ID":"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1","Type":"ContainerDied","Data":"e5322b70887d51db1fa8782b962ea0ad6085b1831b04dfe55a6b1aea8121c544"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.624799 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5322b70887d51db1fa8782b962ea0ad6085b1831b04dfe55a6b1aea8121c544" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.624888 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-skzn7" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.632351 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-52ed-account-create-update-d48pr" event={"ID":"6cb39d68-1138-410e-9577-197e9ff4b0c5","Type":"ContainerDied","Data":"27dd77dc6961ad5dd3b6e3d22433acea2e1addad15e4efe4ca0ccdb719b07a1b"} Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.632441 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27dd77dc6961ad5dd3b6e3d22433acea2e1addad15e4efe4ca0ccdb719b07a1b" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.632489 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-52ed-account-create-update-d48pr" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.645478 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-operator-scripts\") pod \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.645857 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p69k6\" (UniqueName: \"kubernetes.io/projected/4eef913b-f65b-41a9-b0fa-3463914463f5-kube-api-access-p69k6\") pod \"4eef913b-f65b-41a9-b0fa-3463914463f5\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.645965 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbq7k\" (UniqueName: \"kubernetes.io/projected/4fa24fe7-cd66-47b6-9154-101f961c8482-kube-api-access-tbq7k\") pod \"4fa24fe7-cd66-47b6-9154-101f961c8482\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.646068 4912 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1" (UID: "a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.646271 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2aa83-efef-4845-bfd5-ae8bf926f515-operator-scripts\") pod \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.647425 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm6mr\" (UniqueName: \"kubernetes.io/projected/a5da76f4-5031-4a81-ae19-96d01814f859-kube-api-access-tm6mr\") pod \"a5da76f4-5031-4a81-ae19-96d01814f859\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.647675 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa24fe7-cd66-47b6-9154-101f961c8482-operator-scripts\") pod \"4fa24fe7-cd66-47b6-9154-101f961c8482\" (UID: \"4fa24fe7-cd66-47b6-9154-101f961c8482\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.647796 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eef913b-f65b-41a9-b0fa-3463914463f5-operator-scripts\") pod \"4eef913b-f65b-41a9-b0fa-3463914463f5\" (UID: \"4eef913b-f65b-41a9-b0fa-3463914463f5\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.648231 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6sv9\" (UniqueName: 
\"kubernetes.io/projected/ffb2aa83-efef-4845-bfd5-ae8bf926f515-kube-api-access-j6sv9\") pod \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\" (UID: \"ffb2aa83-efef-4845-bfd5-ae8bf926f515\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.648434 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbc8n\" (UniqueName: \"kubernetes.io/projected/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-kube-api-access-vbc8n\") pod \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\" (UID: \"a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.648617 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5da76f4-5031-4a81-ae19-96d01814f859-operator-scripts\") pod \"a5da76f4-5031-4a81-ae19-96d01814f859\" (UID: \"a5da76f4-5031-4a81-ae19-96d01814f859\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.649867 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cb39d68-1138-410e-9577-197e9ff4b0c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.649985 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkxgv\" (UniqueName: \"kubernetes.io/projected/6cb39d68-1138-410e-9577-197e9ff4b0c5-kube-api-access-qkxgv\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.650126 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.651273 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5da76f4-5031-4a81-ae19-96d01814f859-operator-scripts" (OuterVolumeSpecName: "operator-scripts") 
pod "a5da76f4-5031-4a81-ae19-96d01814f859" (UID: "a5da76f4-5031-4a81-ae19-96d01814f859"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.652177 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb2aa83-efef-4845-bfd5-ae8bf926f515-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffb2aa83-efef-4845-bfd5-ae8bf926f515" (UID: "ffb2aa83-efef-4845-bfd5-ae8bf926f515"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.652818 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eef913b-f65b-41a9-b0fa-3463914463f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4eef913b-f65b-41a9-b0fa-3463914463f5" (UID: "4eef913b-f65b-41a9-b0fa-3463914463f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.655983 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa24fe7-cd66-47b6-9154-101f961c8482-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fa24fe7-cd66-47b6-9154-101f961c8482" (UID: "4fa24fe7-cd66-47b6-9154-101f961c8482"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.665537 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb2aa83-efef-4845-bfd5-ae8bf926f515-kube-api-access-j6sv9" (OuterVolumeSpecName: "kube-api-access-j6sv9") pod "ffb2aa83-efef-4845-bfd5-ae8bf926f515" (UID: "ffb2aa83-efef-4845-bfd5-ae8bf926f515"). InnerVolumeSpecName "kube-api-access-j6sv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.667994 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5da76f4-5031-4a81-ae19-96d01814f859-kube-api-access-tm6mr" (OuterVolumeSpecName: "kube-api-access-tm6mr") pod "a5da76f4-5031-4a81-ae19-96d01814f859" (UID: "a5da76f4-5031-4a81-ae19-96d01814f859"). InnerVolumeSpecName "kube-api-access-tm6mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.668124 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eef913b-f65b-41a9-b0fa-3463914463f5-kube-api-access-p69k6" (OuterVolumeSpecName: "kube-api-access-p69k6") pod "4eef913b-f65b-41a9-b0fa-3463914463f5" (UID: "4eef913b-f65b-41a9-b0fa-3463914463f5"). InnerVolumeSpecName "kube-api-access-p69k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.668149 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-kube-api-access-vbc8n" (OuterVolumeSpecName: "kube-api-access-vbc8n") pod "a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1" (UID: "a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1"). InnerVolumeSpecName "kube-api-access-vbc8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.668599 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa24fe7-cd66-47b6-9154-101f961c8482-kube-api-access-tbq7k" (OuterVolumeSpecName: "kube-api-access-tbq7k") pod "4fa24fe7-cd66-47b6-9154-101f961c8482" (UID: "4fa24fe7-cd66-47b6-9154-101f961c8482"). InnerVolumeSpecName "kube-api-access-tbq7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.673465 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.729602 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.756932 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5da76f4-5031-4a81-ae19-96d01814f859-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.756970 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p69k6\" (UniqueName: \"kubernetes.io/projected/4eef913b-f65b-41a9-b0fa-3463914463f5-kube-api-access-p69k6\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.756984 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbq7k\" (UniqueName: \"kubernetes.io/projected/4fa24fe7-cd66-47b6-9154-101f961c8482-kube-api-access-tbq7k\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.756994 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2aa83-efef-4845-bfd5-ae8bf926f515-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.757894 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm6mr\" (UniqueName: \"kubernetes.io/projected/a5da76f4-5031-4a81-ae19-96d01814f859-kube-api-access-tm6mr\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.757920 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4fa24fe7-cd66-47b6-9154-101f961c8482-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.757931 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eef913b-f65b-41a9-b0fa-3463914463f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.757940 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6sv9\" (UniqueName: \"kubernetes.io/projected/ffb2aa83-efef-4845-bfd5-ae8bf926f515-kube-api-access-j6sv9\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.757952 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbc8n\" (UniqueName: \"kubernetes.io/projected/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1-kube-api-access-vbc8n\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.809159 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.860304 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6588379-d349-492e-a673-8f75b93fd640-operator-scripts\") pod \"f6588379-d349-492e-a673-8f75b93fd640\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.860843 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7327a84-0a21-4528-bd67-8a43d103e004-operator-scripts\") pod \"d7327a84-0a21-4528-bd67-8a43d103e004\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.861139 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfz7k\" (UniqueName: \"kubernetes.io/projected/f6588379-d349-492e-a673-8f75b93fd640-kube-api-access-nfz7k\") pod \"f6588379-d349-492e-a673-8f75b93fd640\" (UID: \"f6588379-d349-492e-a673-8f75b93fd640\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.861218 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w4p6\" (UniqueName: \"kubernetes.io/projected/d7327a84-0a21-4528-bd67-8a43d103e004-kube-api-access-8w4p6\") pod \"d7327a84-0a21-4528-bd67-8a43d103e004\" (UID: \"d7327a84-0a21-4528-bd67-8a43d103e004\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.861451 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6588379-d349-492e-a673-8f75b93fd640-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6588379-d349-492e-a673-8f75b93fd640" (UID: "f6588379-d349-492e-a673-8f75b93fd640"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.861751 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7327a84-0a21-4528-bd67-8a43d103e004-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7327a84-0a21-4528-bd67-8a43d103e004" (UID: "d7327a84-0a21-4528-bd67-8a43d103e004"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.862746 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6588379-d349-492e-a673-8f75b93fd640-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.862772 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7327a84-0a21-4528-bd67-8a43d103e004-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.890550 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7327a84-0a21-4528-bd67-8a43d103e004-kube-api-access-8w4p6" (OuterVolumeSpecName: "kube-api-access-8w4p6") pod "d7327a84-0a21-4528-bd67-8a43d103e004" (UID: "d7327a84-0a21-4528-bd67-8a43d103e004"). InnerVolumeSpecName "kube-api-access-8w4p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.894863 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6588379-d349-492e-a673-8f75b93fd640-kube-api-access-nfz7k" (OuterVolumeSpecName: "kube-api-access-nfz7k") pod "f6588379-d349-492e-a673-8f75b93fd640" (UID: "f6588379-d349-492e-a673-8f75b93fd640"). InnerVolumeSpecName "kube-api-access-nfz7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.936367 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.986614 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-nb\") pod \"ae016812-7be3-4001-822e-9979bd4ce648\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.986702 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-0\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.986760 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-1\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.986844 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-config\") pod \"ae016812-7be3-4001-822e-9979bd4ce648\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987010 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987191 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-2\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987229 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-web-config\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987255 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-tls-assets\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987383 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-config\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987416 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-sb\") pod \"ae016812-7be3-4001-822e-9979bd4ce648\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " Mar 18 13:25:47 crc 
kubenswrapper[4912]: I0318 13:25:47.987514 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-dns-svc\") pod \"ae016812-7be3-4001-822e-9979bd4ce648\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987560 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-thanos-prometheus-http-client-file\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987629 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73069f34-9c0b-4204-a2f3-8b283232ce86-config-out\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987663 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4dvw\" (UniqueName: \"kubernetes.io/projected/ae016812-7be3-4001-822e-9979bd4ce648-kube-api-access-v4dvw\") pod \"ae016812-7be3-4001-822e-9979bd4ce648\" (UID: \"ae016812-7be3-4001-822e-9979bd4ce648\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.987692 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klxcc\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-kube-api-access-klxcc\") pod \"73069f34-9c0b-4204-a2f3-8b283232ce86\" (UID: \"73069f34-9c0b-4204-a2f3-8b283232ce86\") " Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.994771 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:47 crc kubenswrapper[4912]: I0318 13:25:47.995953 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.002379 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae016812-7be3-4001-822e-9979bd4ce648-kube-api-access-v4dvw" (OuterVolumeSpecName: "kube-api-access-v4dvw") pod "ae016812-7be3-4001-822e-9979bd4ce648" (UID: "ae016812-7be3-4001-822e-9979bd4ce648"). InnerVolumeSpecName "kube-api-access-v4dvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.003816 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfz7k\" (UniqueName: \"kubernetes.io/projected/f6588379-d349-492e-a673-8f75b93fd640-kube-api-access-nfz7k\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.003873 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w4p6\" (UniqueName: \"kubernetes.io/projected/d7327a84-0a21-4528-bd67-8a43d103e004-kube-api-access-8w4p6\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.003888 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4dvw\" (UniqueName: \"kubernetes.io/projected/ae016812-7be3-4001-822e-9979bd4ce648-kube-api-access-v4dvw\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.003903 4912 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.003921 4912 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.004245 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73069f34-9c0b-4204-a2f3-8b283232ce86-config-out" (OuterVolumeSpecName: "config-out") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.009139 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-config" (OuterVolumeSpecName: "config") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.020691 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.043858 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.050389 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-kube-api-access-klxcc" (OuterVolumeSpecName: "kube-api-access-klxcc") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "kube-api-access-klxcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.056813 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.089987 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "pvc-0a774717-713f-4d74-b299-ab1f68cce60e". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.096219 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-web-config" (OuterVolumeSpecName: "web-config") pod "73069f34-9c0b-4204-a2f3-8b283232ce86" (UID: "73069f34-9c0b-4204-a2f3-8b283232ce86"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.107916 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-config" (OuterVolumeSpecName: "config") pod "ae016812-7be3-4001-822e-9979bd4ce648" (UID: "ae016812-7be3-4001-822e-9979bd4ce648"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108451 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108503 4912 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108515 4912 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73069f34-9c0b-4204-a2f3-8b283232ce86-config-out\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108527 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klxcc\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-kube-api-access-klxcc\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108538 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108579 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") on node \"crc\" " Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108590 4912 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/73069f34-9c0b-4204-a2f3-8b283232ce86-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108605 4912 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73069f34-9c0b-4204-a2f3-8b283232ce86-web-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.108620 4912 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73069f34-9c0b-4204-a2f3-8b283232ce86-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.119332 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae016812-7be3-4001-822e-9979bd4ce648" (UID: "ae016812-7be3-4001-822e-9979bd4ce648"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.137839 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae016812-7be3-4001-822e-9979bd4ce648" (UID: "ae016812-7be3-4001-822e-9979bd4ce648"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.150750 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.151088 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0a774717-713f-4d74-b299-ab1f68cce60e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e") on node "crc" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.153409 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae016812-7be3-4001-822e-9979bd4ce648" (UID: "ae016812-7be3-4001-822e-9979bd4ce648"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.210815 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.210855 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.210867 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae016812-7be3-4001-822e-9979bd4ce648-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.210881 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.332352 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae016812_7be3_4001_822e_9979bd4ce648.slice\": RecentStats: unable to find data in memory cache]" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.353765 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae016812_7be3_4001_822e_9979bd4ce648.slice/crio-f1412cdb532d7d81c4a674ac082fe9bfc07ec9c48e8f0df40cbe6f12704603e0\": RecentStats: unable to find data in memory cache]" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.647898 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"73069f34-9c0b-4204-a2f3-8b283232ce86","Type":"ContainerDied","Data":"5a30928ffa23aea28118024b95923d71356d6df1864b6e4c7e30f5adf0bc5bfe"} Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.647969 4912 scope.go:117] "RemoveContainer" containerID="64c86d6e3878ff47f0697a7a5cbfc14a1eeee8ed2bf4671f91bc03cc3dbea0cd" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.648155 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.662811 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" event={"ID":"ae016812-7be3-4001-822e-9979bd4ce648","Type":"ContainerDied","Data":"f1412cdb532d7d81c4a674ac082fe9bfc07ec9c48e8f0df40cbe6f12704603e0"} Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.663110 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-449s7" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.677468 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ebf9-account-create-update-8tl2r" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.678632 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dhhd8" event={"ID":"c77cb1e2-3c24-41cb-95fa-ff54327ae194","Type":"ContainerStarted","Data":"cc8193a78a993ab1c9ae50a3bacb9b07cb5c6845be38a01286cf948cccc10050"} Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.678786 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xpx4j" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.691267 4912 scope.go:117] "RemoveContainer" containerID="a8d9f4d281be291c4edb28b0d4e59e0e7d084e8a8634af5b5794d113bd83311f" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.712835 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.738300 4912 scope.go:117] "RemoveContainer" containerID="d1af359093d0e429a68984f9a01904c2313a0a170d01fbf3508cc8e1dd2fbcab" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.738937 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.763028 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-449s7"] Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.789148 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-449s7"] Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.813223 4912 scope.go:117] "RemoveContainer" containerID="61cd9944abf703e9cd8146a733e46295eaa8dbab025cf3ccba1d6d52b50d2ed2" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.839146 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.839924 4912 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7327a84-0a21-4528-bd67-8a43d103e004" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.839945 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7327a84-0a21-4528-bd67-8a43d103e004" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.839968 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="thanos-sidecar" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.839975 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="thanos-sidecar" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.839984 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa24fe7-cd66-47b6-9154-101f961c8482" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.839992 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa24fe7-cd66-47b6-9154-101f961c8482" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840008 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eef913b-f65b-41a9-b0fa-3463914463f5" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840016 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eef913b-f65b-41a9-b0fa-3463914463f5" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840029 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae016812-7be3-4001-822e-9979bd4ce648" containerName="dnsmasq-dns" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840062 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae016812-7be3-4001-822e-9979bd4ce648" containerName="dnsmasq-dns" Mar 18 13:25:48 crc 
kubenswrapper[4912]: E0318 13:25:48.840079 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae016812-7be3-4001-822e-9979bd4ce648" containerName="init" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840088 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae016812-7be3-4001-822e-9979bd4ce648" containerName="init" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840103 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb2aa83-efef-4845-bfd5-ae8bf926f515" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840112 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb2aa83-efef-4845-bfd5-ae8bf926f515" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840124 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="init-config-reloader" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840133 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="init-config-reloader" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840158 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840166 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840176 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb39d68-1138-410e-9577-197e9ff4b0c5" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840184 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb39d68-1138-410e-9577-197e9ff4b0c5" 
containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840199 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5da76f4-5031-4a81-ae19-96d01814f859" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840208 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5da76f4-5031-4a81-ae19-96d01814f859" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840226 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6588379-d349-492e-a673-8f75b93fd640" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840234 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6588379-d349-492e-a673-8f75b93fd640" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840258 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="prometheus" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840266 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="prometheus" Mar 18 13:25:48 crc kubenswrapper[4912]: E0318 13:25:48.840284 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="config-reloader" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840293 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="config-reloader" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840549 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="prometheus" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840565 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ffb2aa83-efef-4845-bfd5-ae8bf926f515" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840582 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="thanos-sidecar" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840592 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eef913b-f65b-41a9-b0fa-3463914463f5" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840605 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" containerName="config-reloader" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840617 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae016812-7be3-4001-822e-9979bd4ce648" containerName="dnsmasq-dns" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840633 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840644 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa24fe7-cd66-47b6-9154-101f961c8482" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840651 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6588379-d349-492e-a673-8f75b93fd640" containerName="mariadb-database-create" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840662 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7327a84-0a21-4528-bd67-8a43d103e004" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.840671 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5da76f4-5031-4a81-ae19-96d01814f859" containerName="mariadb-database-create" Mar 18 13:25:48 crc 
kubenswrapper[4912]: I0318 13:25:48.840676 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb39d68-1138-410e-9577-197e9ff4b0c5" containerName="mariadb-account-create-update" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.843171 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.851635 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.851665 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.851916 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.852012 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.852158 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.852522 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.859283 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.859655 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l5hz4" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.874966 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-dhhd8" podStartSLOduration=3.749362053 podStartE2EDuration="10.874937036s" podCreationTimestamp="2026-03-18 13:25:38 +0000 UTC" firstStartedPulling="2026-03-18 13:25:40.187472672 +0000 UTC m=+1388.646900097" lastFinishedPulling="2026-03-18 13:25:47.313047655 +0000 UTC m=+1395.772475080" observedRunningTime="2026-03-18 13:25:48.816568189 +0000 UTC m=+1397.275995624" watchObservedRunningTime="2026-03-18 13:25:48.874937036 +0000 UTC m=+1397.334364461" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.875219 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.936851 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:25:48 crc kubenswrapper[4912]: I0318 13:25:48.949821 4912 scope.go:117] "RemoveContainer" containerID="c310ab7b8dc3896908da85cfc27b03735933f37091211f10c8fb3db17cfe4826" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.057083 4912 scope.go:117] "RemoveContainer" containerID="911443aed4c6547a6b79646429ce2f624e9ba8e926b2288c0dda2cfeed9cfe5c" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059300 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059354 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpm9k\" (UniqueName: \"kubernetes.io/projected/a406878a-6e90-4c47-8e23-875349b55b1d-kube-api-access-kpm9k\") pod 
\"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059383 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059418 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059469 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059495 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059548 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/a406878a-6e90-4c47-8e23-875349b55b1d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059579 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a406878a-6e90-4c47-8e23-875349b55b1d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059610 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059662 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059708 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 
13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059757 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.059779 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.162842 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.162932 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.163005 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.163066 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpm9k\" (UniqueName: \"kubernetes.io/projected/a406878a-6e90-4c47-8e23-875349b55b1d-kube-api-access-kpm9k\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.163111 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.170462 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.175126 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc 
kubenswrapper[4912]: I0318 13:25:49.175627 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.175773 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.175919 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a406878a-6e90-4c47-8e23-875349b55b1d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.176093 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a406878a-6e90-4c47-8e23-875349b55b1d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.176219 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.176435 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.176606 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.177650 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.178387 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a406878a-6e90-4c47-8e23-875349b55b1d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.185389 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.191647 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.195450 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a406878a-6e90-4c47-8e23-875349b55b1d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.206828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.212095 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.214864 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " 
pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.217273 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a406878a-6e90-4c47-8e23-875349b55b1d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.217468 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a406878a-6e90-4c47-8e23-875349b55b1d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.217967 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpm9k\" (UniqueName: \"kubernetes.io/projected/a406878a-6e90-4c47-8e23-875349b55b1d-kube-api-access-kpm9k\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.352698 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.352760 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/762887b1ff77fd04cff9a8d1f3f0d3bfb1e91ae8558b3b5a91b139edd2c848bf/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.429020 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a774717-713f-4d74-b299-ab1f68cce60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a774717-713f-4d74-b299-ab1f68cce60e\") pod \"prometheus-metric-storage-0\" (UID: \"a406878a-6e90-4c47-8e23-875349b55b1d\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:49 crc kubenswrapper[4912]: I0318 13:25:49.549396 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:25:50 crc kubenswrapper[4912]: I0318 13:25:50.133460 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:25:50 crc kubenswrapper[4912]: I0318 13:25:50.244508 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73069f34-9c0b-4204-a2f3-8b283232ce86" path="/var/lib/kubelet/pods/73069f34-9c0b-4204-a2f3-8b283232ce86/volumes" Mar 18 13:25:50 crc kubenswrapper[4912]: I0318 13:25:50.245633 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae016812-7be3-4001-822e-9979bd4ce648" path="/var/lib/kubelet/pods/ae016812-7be3-4001-822e-9979bd4ce648/volumes" Mar 18 13:25:50 crc kubenswrapper[4912]: I0318 13:25:50.722443 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a406878a-6e90-4c47-8e23-875349b55b1d","Type":"ContainerStarted","Data":"9fc73a82a41aac99e3fb0440e7b78ed54759a9fa2a4a5327f4a9f4003628a158"} Mar 18 13:25:54 crc kubenswrapper[4912]: I0318 13:25:54.769682 4912 generic.go:334] "Generic (PLEG): container finished" podID="a92c61dc-cfdf-4610-81b7-553c9882fc26" containerID="7b3f3e459c9334390333b26ef9e0b392b8bf6f6ab79f21810221ba342e4bc120" exitCode=0 Mar 18 13:25:54 crc kubenswrapper[4912]: I0318 13:25:54.769773 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-krlbd" event={"ID":"a92c61dc-cfdf-4610-81b7-553c9882fc26","Type":"ContainerDied","Data":"7b3f3e459c9334390333b26ef9e0b392b8bf6f6ab79f21810221ba342e4bc120"} Mar 18 13:25:54 crc kubenswrapper[4912]: I0318 13:25:54.773714 4912 generic.go:334] "Generic (PLEG): container finished" podID="c77cb1e2-3c24-41cb-95fa-ff54327ae194" containerID="cc8193a78a993ab1c9ae50a3bacb9b07cb5c6845be38a01286cf948cccc10050" exitCode=0 Mar 18 13:25:54 crc kubenswrapper[4912]: I0318 13:25:54.773769 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-dhhd8" event={"ID":"c77cb1e2-3c24-41cb-95fa-ff54327ae194","Type":"ContainerDied","Data":"cc8193a78a993ab1c9ae50a3bacb9b07cb5c6845be38a01286cf948cccc10050"} Mar 18 13:25:54 crc kubenswrapper[4912]: I0318 13:25:54.776439 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a406878a-6e90-4c47-8e23-875349b55b1d","Type":"ContainerStarted","Data":"2bee13df2c7079f84e2d0a71cc1e1c800bc83a92926b862da769f4a3aec73572"} Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.421493 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.540359 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5rmq\" (UniqueName: \"kubernetes.io/projected/c77cb1e2-3c24-41cb-95fa-ff54327ae194-kube-api-access-h5rmq\") pod \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.540530 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-config-data\") pod \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.540692 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-combined-ca-bundle\") pod \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\" (UID: \"c77cb1e2-3c24-41cb-95fa-ff54327ae194\") " Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.554282 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77cb1e2-3c24-41cb-95fa-ff54327ae194-kube-api-access-h5rmq" 
(OuterVolumeSpecName: "kube-api-access-h5rmq") pod "c77cb1e2-3c24-41cb-95fa-ff54327ae194" (UID: "c77cb1e2-3c24-41cb-95fa-ff54327ae194"). InnerVolumeSpecName "kube-api-access-h5rmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.572888 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c77cb1e2-3c24-41cb-95fa-ff54327ae194" (UID: "c77cb1e2-3c24-41cb-95fa-ff54327ae194"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.601352 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-config-data" (OuterVolumeSpecName: "config-data") pod "c77cb1e2-3c24-41cb-95fa-ff54327ae194" (UID: "c77cb1e2-3c24-41cb-95fa-ff54327ae194"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.615707 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-krlbd" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.645212 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.645256 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77cb1e2-3c24-41cb-95fa-ff54327ae194-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.645271 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5rmq\" (UniqueName: \"kubernetes.io/projected/c77cb1e2-3c24-41cb-95fa-ff54327ae194-kube-api-access-h5rmq\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.747393 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-config-data\") pod \"a92c61dc-cfdf-4610-81b7-553c9882fc26\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.747526 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-combined-ca-bundle\") pod \"a92c61dc-cfdf-4610-81b7-553c9882fc26\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.747793 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-db-sync-config-data\") pod \"a92c61dc-cfdf-4610-81b7-553c9882fc26\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 
13:25:56.747860 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psnkw\" (UniqueName: \"kubernetes.io/projected/a92c61dc-cfdf-4610-81b7-553c9882fc26-kube-api-access-psnkw\") pod \"a92c61dc-cfdf-4610-81b7-553c9882fc26\" (UID: \"a92c61dc-cfdf-4610-81b7-553c9882fc26\") " Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.752283 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92c61dc-cfdf-4610-81b7-553c9882fc26-kube-api-access-psnkw" (OuterVolumeSpecName: "kube-api-access-psnkw") pod "a92c61dc-cfdf-4610-81b7-553c9882fc26" (UID: "a92c61dc-cfdf-4610-81b7-553c9882fc26"). InnerVolumeSpecName "kube-api-access-psnkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.752860 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a92c61dc-cfdf-4610-81b7-553c9882fc26" (UID: "a92c61dc-cfdf-4610-81b7-553c9882fc26"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.777928 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a92c61dc-cfdf-4610-81b7-553c9882fc26" (UID: "a92c61dc-cfdf-4610-81b7-553c9882fc26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.804949 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dhhd8" event={"ID":"c77cb1e2-3c24-41cb-95fa-ff54327ae194","Type":"ContainerDied","Data":"bc11888c51d87539d8d51f511b13ce0cb28393dc0fbb94a81020804eb449a1a7"} Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.805002 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc11888c51d87539d8d51f511b13ce0cb28393dc0fbb94a81020804eb449a1a7" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.804999 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dhhd8" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.806733 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-krlbd" event={"ID":"a92c61dc-cfdf-4610-81b7-553c9882fc26","Type":"ContainerDied","Data":"2bbe7984a5c43471ab5affb70afed900f611123c519ef430b2bc565ab20b6a99"} Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.806775 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-krlbd" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.806788 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bbe7984a5c43471ab5affb70afed900f611123c519ef430b2bc565ab20b6a99" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.821309 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-config-data" (OuterVolumeSpecName: "config-data") pod "a92c61dc-cfdf-4610-81b7-553c9882fc26" (UID: "a92c61dc-cfdf-4610-81b7-553c9882fc26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.850290 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.850332 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.850343 4912 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a92c61dc-cfdf-4610-81b7-553c9882fc26-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:56 crc kubenswrapper[4912]: I0318 13:25:56.850354 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psnkw\" (UniqueName: \"kubernetes.io/projected/a92c61dc-cfdf-4610-81b7-553c9882fc26-kube-api-access-psnkw\") on node \"crc\" DevicePath \"\"" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.137715 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-9hmv5"] Mar 18 13:25:57 crc kubenswrapper[4912]: E0318 13:25:57.145173 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92c61dc-cfdf-4610-81b7-553c9882fc26" containerName="glance-db-sync" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.145208 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c61dc-cfdf-4610-81b7-553c9882fc26" containerName="glance-db-sync" Mar 18 13:25:57 crc kubenswrapper[4912]: E0318 13:25:57.145223 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77cb1e2-3c24-41cb-95fa-ff54327ae194" containerName="keystone-db-sync" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.145234 4912 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c77cb1e2-3c24-41cb-95fa-ff54327ae194" containerName="keystone-db-sync" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.145476 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92c61dc-cfdf-4610-81b7-553c9882fc26" containerName="glance-db-sync" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.145504 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77cb1e2-3c24-41cb-95fa-ff54327ae194" containerName="keystone-db-sync" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.146985 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.161963 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bm267"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.165946 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.188761 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.189028 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.189197 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.189321 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knn24" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.189435 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.199401 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-9hmv5"] 
Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264206 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264297 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-config-data\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264325 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-scripts\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264437 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftzjc\" (UniqueName: \"kubernetes.io/projected/7066bcdd-2c08-451c-9985-a103bbd007ab-kube-api-access-ftzjc\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264512 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-svc\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 
13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264554 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264603 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-credential-keys\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264676 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264709 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-fernet-keys\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264743 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-config\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 
18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264775 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6q6k\" (UniqueName: \"kubernetes.io/projected/068dd8a7-2c6a-4da2-abe3-37df375434a5-kube-api-access-n6q6k\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.264845 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-combined-ca-bundle\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.265415 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bm267"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.381735 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftzjc\" (UniqueName: \"kubernetes.io/projected/7066bcdd-2c08-451c-9985-a103bbd007ab-kube-api-access-ftzjc\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.381810 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-svc\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.381829 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.381859 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-credential-keys\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.381949 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.381977 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-fernet-keys\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.382001 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-config\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.382027 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6q6k\" (UniqueName: \"kubernetes.io/projected/068dd8a7-2c6a-4da2-abe3-37df375434a5-kube-api-access-n6q6k\") 
pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.382124 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-combined-ca-bundle\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.382202 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.382240 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-scripts\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.382266 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-config-data\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.386684 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-config\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " 
pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.390692 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-combined-ca-bundle\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.391964 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.394375 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-scripts\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.395456 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-svc\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.399934 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.400951 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.410011 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-config-data\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.418242 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-fernet-keys\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.419694 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-credential-keys\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.478168 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-j4dhx"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.480384 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.481872 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftzjc\" (UniqueName: \"kubernetes.io/projected/7066bcdd-2c08-451c-9985-a103bbd007ab-kube-api-access-ftzjc\") pod \"keystone-bootstrap-bm267\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.484900 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-cd8zm" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.499367 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.500274 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6q6k\" (UniqueName: \"kubernetes.io/projected/068dd8a7-2c6a-4da2-abe3-37df375434a5-kube-api-access-n6q6k\") pod \"dnsmasq-dns-5b868669f-9hmv5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.517444 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bm267" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.518629 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-j4dhx"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.574941 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-9hmv5"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.576189 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.587160 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-combined-ca-bundle\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.587342 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-config-data\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.587826 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9ts8\" (UniqueName: \"kubernetes.io/projected/539c384f-3502-4474-9c40-432909696dfb-kube-api-access-q9ts8\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.612092 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fbf2l"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.613969 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.625388 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.625783 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xv548" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.654859 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.719915 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9ts8\" (UniqueName: \"kubernetes.io/projected/539c384f-3502-4474-9c40-432909696dfb-kube-api-access-q9ts8\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.720501 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpg5\" (UniqueName: \"kubernetes.io/projected/147c4d2b-19d3-48da-9364-c527a1cacc3c-kube-api-access-glpg5\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.720622 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-db-sync-config-data\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.720864 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-combined-ca-bundle\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.720894 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/147c4d2b-19d3-48da-9364-c527a1cacc3c-etc-machine-id\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.721024 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-config-data\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.721102 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-scripts\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.721193 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-combined-ca-bundle\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.721240 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-config-data\") pod 
\"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.723282 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fbf2l"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.757013 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-combined-ca-bundle\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.768823 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-config-data\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.785668 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9ts8\" (UniqueName: \"kubernetes.io/projected/539c384f-3502-4474-9c40-432909696dfb-kube-api-access-q9ts8\") pod \"heat-db-sync-j4dhx\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.818804 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-6zf76"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.828875 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glpg5\" (UniqueName: \"kubernetes.io/projected/147c4d2b-19d3-48da-9364-c527a1cacc3c-kube-api-access-glpg5\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.828987 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-db-sync-config-data\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.829165 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/147c4d2b-19d3-48da-9364-c527a1cacc3c-etc-machine-id\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.829254 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-scripts\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.829308 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-combined-ca-bundle\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.829334 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-config-data\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.830690 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.840303 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/147c4d2b-19d3-48da-9364-c527a1cacc3c-etc-machine-id\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.848357 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k5qfz" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.848600 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.853357 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-db-sync-config-data\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.866846 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-scripts\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.867761 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.875884 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-combined-ca-bundle\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " 
pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.894905 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-j4dhx" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.903733 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-config-data\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.916278 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glpg5\" (UniqueName: \"kubernetes.io/projected/147c4d2b-19d3-48da-9364-c527a1cacc3c-kube-api-access-glpg5\") pod \"cinder-db-sync-fbf2l\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.933526 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-td5nc"] Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.936677 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-combined-ca-bundle\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.936962 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rccx\" (UniqueName: \"kubernetes.io/projected/39c9cb6b-7493-434e-b069-61ce32dcdc95-kube-api-access-8rccx\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.937172 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-config\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.941885 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:57 crc kubenswrapper[4912]: I0318 13:25:57.950930 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-6zf76"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.000142 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-td5nc"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.023782 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.053173 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-d7x2r"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.056327 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.056419 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc 
kubenswrapper[4912]: I0318 13:25:58.056483 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-combined-ca-bundle\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.056546 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfsxz\" (UniqueName: \"kubernetes.io/projected/1f5e6082-788c-4556-b53c-d4b62b9649db-kube-api-access-mfsxz\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.058139 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.059750 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.063229 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rccx\" (UniqueName: \"kubernetes.io/projected/39c9cb6b-7493-434e-b069-61ce32dcdc95-kube-api-access-8rccx\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.063468 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.063626 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-config\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.063763 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-config\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.075446 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xlmlb" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.076588 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-config\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.084184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-combined-ca-bundle\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.094066 4912 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"barbican-config-data" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.118541 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rccx\" (UniqueName: \"kubernetes.io/projected/39c9cb6b-7493-434e-b069-61ce32dcdc95-kube-api-access-8rccx\") pod \"neutron-db-sync-6zf76\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.137470 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d7x2r"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.156587 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-tlb72"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.158427 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169236 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169304 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-db-sync-config-data\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169438 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfsxz\" (UniqueName: \"kubernetes.io/projected/1f5e6082-788c-4556-b53c-d4b62b9649db-kube-api-access-mfsxz\") pod 
\"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169502 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvpg\" (UniqueName: \"kubernetes.io/projected/e7732fd2-b813-47e5-8f23-823a3037df09-kube-api-access-cwvpg\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169528 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169573 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169599 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-config\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169668 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: 
\"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.169686 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-combined-ca-bundle\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.170724 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.171191 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.171442 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.171784 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zqbzz" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.172581 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-config\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.198255 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-sb\") pod 
\"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.204603 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.204828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.222969 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfsxz\" (UniqueName: \"kubernetes.io/projected/1f5e6082-788c-4556-b53c-d4b62b9649db-kube-api-access-mfsxz\") pod \"dnsmasq-dns-bbf5cc879-td5nc\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.268808 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tlb72"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272346 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-combined-ca-bundle\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272414 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-logs\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272462 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-db-sync-config-data\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272486 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-scripts\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272528 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9nrt\" (UniqueName: \"kubernetes.io/projected/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-kube-api-access-z9nrt\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272603 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvpg\" (UniqueName: \"kubernetes.io/projected/e7732fd2-b813-47e5-8f23-823a3037df09-kube-api-access-cwvpg\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272654 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-combined-ca-bundle\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.272710 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-config-data\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.287022 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-db-sync-config-data\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.298975 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-combined-ca-bundle\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.306286 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-6zf76" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.320138 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-td5nc"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.321590 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.377507 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvpg\" (UniqueName: \"kubernetes.io/projected/e7732fd2-b813-47e5-8f23-823a3037df09-kube-api-access-cwvpg\") pod \"barbican-db-sync-d7x2r\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.379184 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-config-data\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.379232 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-logs\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.379278 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-scripts\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.379311 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9nrt\" (UniqueName: \"kubernetes.io/projected/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-kube-api-access-z9nrt\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.379424 
4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-combined-ca-bundle\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.383100 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-logs\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.393409 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gp7xf"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.395677 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-scripts\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.397738 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.423225 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.428434 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-config-data\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.450833 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-combined-ca-bundle\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.459027 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9nrt\" (UniqueName: \"kubernetes.io/projected/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-kube-api-access-z9nrt\") pod \"placement-db-sync-tlb72\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.490270 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gp7xf"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.515364 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tlb72" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.527828 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-config\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.527895 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrghc\" (UniqueName: \"kubernetes.io/projected/dadc5395-e931-4293-b037-929db9a9bd99-kube-api-access-nrghc\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.528160 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.528519 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.528590 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-svc\") pod 
\"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.528702 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.612803 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.641933 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.642010 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.642090 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-config\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.642111 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrghc\" 
(UniqueName: \"kubernetes.io/projected/dadc5395-e931-4293-b037-929db9a9bd99-kube-api-access-nrghc\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.642162 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.642245 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.652109 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.652756 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.653491 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.658978 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.669451 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.671482 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.746554 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-config\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.760614 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.764231 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.813862 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 
18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.814027 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l5r2\" (UniqueName: \"kubernetes.io/projected/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-kube-api-access-9l5r2\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.826084 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrghc\" (UniqueName: \"kubernetes.io/projected/dadc5395-e931-4293-b037-929db9a9bd99-kube-api-access-nrghc\") pod \"dnsmasq-dns-56df8fb6b7-gp7xf\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.860079 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-scripts\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.860169 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-run-httpd\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.884174 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.884245 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-log-httpd\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.884464 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-config-data\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:58 crc kubenswrapper[4912]: I0318 13:25:58.979851 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.019745 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.022470 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l5r2\" (UniqueName: \"kubernetes.io/projected/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-kube-api-access-9l5r2\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.022697 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-scripts\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.022816 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-run-httpd\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.023263 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.023319 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-log-httpd\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.023655 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-config-data\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.043627 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-log-httpd\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.043989 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-run-httpd\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 
13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.049338 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-config-data\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.056700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-scripts\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.127322 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.128121 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.134236 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bm267" event={"ID":"7066bcdd-2c08-451c-9985-a103bbd007ab","Type":"ContainerStarted","Data":"6ff1a6c726cc8893c8c1c04d8e5688056a877d6c9eb3d67704e4217fd833a240"} Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.162722 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l5r2\" (UniqueName: \"kubernetes.io/projected/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-kube-api-access-9l5r2\") pod \"ceilometer-0\" (UID: 
\"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.162826 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.166884 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.198383 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lm6d9" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.198629 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.199526 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.214788 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.255348 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bm267"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.346951 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355105 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355237 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-config-data\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355279 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-scripts\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355333 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355361 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdmrn\" (UniqueName: \"kubernetes.io/projected/f288c82f-0efc-4acd-bd6f-e60b60b7030e-kube-api-access-qdmrn\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355367 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355420 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.355447 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-logs\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.360530 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.389162 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.422574 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466088 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466171 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-config-data\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466230 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-scripts\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466323 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466437 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " 
pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466481 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdmrn\" (UniqueName: \"kubernetes.io/projected/f288c82f-0efc-4acd-bd6f-e60b60b7030e-kube-api-access-qdmrn\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466577 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-logs\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466817 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.466922 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvlp\" (UniqueName: \"kubernetes.io/projected/e83866d6-0538-4802-868e-eac2411d81db-kube-api-access-lmvlp\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.467004 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.467065 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.467129 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-logs\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.467164 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.467504 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.479350 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-logs\") pod \"glance-default-external-api-0\" (UID: 
\"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.480135 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.495829 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-scripts\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.498327 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.507719 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.507764 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a72a99aa2c60b032f6beea238cfa834e3b34d6f5ea6f566ee310bef28b7d80e1/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.509011 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-config-data\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.561368 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fbf2l"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.570159 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdmrn\" (UniqueName: \"kubernetes.io/projected/f288c82f-0efc-4acd-bd6f-e60b60b7030e-kube-api-access-qdmrn\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.571991 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.572084 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.572166 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-logs\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.572198 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.572226 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvlp\" (UniqueName: \"kubernetes.io/projected/e83866d6-0538-4802-868e-eac2411d81db-kube-api-access-lmvlp\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.572260 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.572289 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.578195 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.578222 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-logs\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.582479 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.584247 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.584277 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26f2544761a6fbc0806c14fe58682cf74b4da7181bc2e0537e305660953f7255/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.595875 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.602087 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-j4dhx"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.614925 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.615576 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvlp\" (UniqueName: \"kubernetes.io/projected/e83866d6-0538-4802-868e-eac2411d81db-kube-api-access-lmvlp\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.622597 4912 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-5b868669f-9hmv5"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.625408 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.663477 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.801770 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-6zf76"] Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.836781 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:25:59 crc kubenswrapper[4912]: I0318 13:25:59.872890 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.076575 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:00 crc kubenswrapper[4912]: E0318 13:26:00.107835 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda406878a_6e90_4c47_8e23_875349b55b1d.slice/crio-2bee13df2c7079f84e2d0a71cc1e1c800bc83a92926b862da769f4a3aec73572.scope\": RecentStats: unable to find data in memory cache]" Mar 18 13:26:00 crc kubenswrapper[4912]: E0318 13:26:00.108678 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda406878a_6e90_4c47_8e23_875349b55b1d.slice/crio-conmon-2bee13df2c7079f84e2d0a71cc1e1c800bc83a92926b862da769f4a3aec73572.scope\": RecentStats: unable to find data in memory cache]" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.199909 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564006-nt9g8"] Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.216318 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.224775 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.235162 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.292381 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.333757 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-nt9g8"] Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.351727 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-9hmv5" event={"ID":"068dd8a7-2c6a-4da2-abe3-37df375434a5","Type":"ContainerStarted","Data":"37c7e694dec22f37f9aae59f46baf624cdc953b5ca03d31cbdefa1fe7e811969"} Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.360038 4912 generic.go:334] "Generic (PLEG): container finished" podID="a406878a-6e90-4c47-8e23-875349b55b1d" containerID="2bee13df2c7079f84e2d0a71cc1e1c800bc83a92926b862da769f4a3aec73572" exitCode=0 Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.360142 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a406878a-6e90-4c47-8e23-875349b55b1d","Type":"ContainerDied","Data":"2bee13df2c7079f84e2d0a71cc1e1c800bc83a92926b862da769f4a3aec73572"} Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.380953 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4dhx" event={"ID":"539c384f-3502-4474-9c40-432909696dfb","Type":"ContainerStarted","Data":"2a08a6f6f90c5430e8a4da755dee841ff7eebb140c794997f39763059563d0c4"} Mar 18 13:26:00 crc 
kubenswrapper[4912]: I0318 13:26:00.426983 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6zf76" event={"ID":"39c9cb6b-7493-434e-b069-61ce32dcdc95","Type":"ContainerStarted","Data":"f9ebd39260084d1ab145c9a58f1754595890a014688022cead65666426eaa466"} Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.434723 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fbf2l" event={"ID":"147c4d2b-19d3-48da-9364-c527a1cacc3c","Type":"ContainerStarted","Data":"81488a03a7d89c2f077207011325153300696101039858ec7ff6856643ad1127"} Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.444258 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn8mf\" (UniqueName: \"kubernetes.io/projected/366d4bcf-ceeb-48e7-a834-c579166036b6-kube-api-access-xn8mf\") pod \"auto-csr-approver-29564006-nt9g8\" (UID: \"366d4bcf-ceeb-48e7-a834-c579166036b6\") " pod="openshift-infra/auto-csr-approver-29564006-nt9g8" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.464129 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bm267" podStartSLOduration=3.46402811 podStartE2EDuration="3.46402811s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:00.448662198 +0000 UTC m=+1408.908089623" watchObservedRunningTime="2026-03-18 13:26:00.46402811 +0000 UTC m=+1408.923455535" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.572543 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn8mf\" (UniqueName: \"kubernetes.io/projected/366d4bcf-ceeb-48e7-a834-c579166036b6-kube-api-access-xn8mf\") pod \"auto-csr-approver-29564006-nt9g8\" (UID: \"366d4bcf-ceeb-48e7-a834-c579166036b6\") " 
pod="openshift-infra/auto-csr-approver-29564006-nt9g8" Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.628527 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn8mf\" (UniqueName: \"kubernetes.io/projected/366d4bcf-ceeb-48e7-a834-c579166036b6-kube-api-access-xn8mf\") pod \"auto-csr-approver-29564006-nt9g8\" (UID: \"366d4bcf-ceeb-48e7-a834-c579166036b6\") " pod="openshift-infra/auto-csr-approver-29564006-nt9g8" Mar 18 13:26:00 crc kubenswrapper[4912]: W0318 13:26:00.766526 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f5e6082_788c_4556_b53c_d4b62b9649db.slice/crio-9f72946f5a2f4b4d6777b04a76547277032c6441f3eb021729198971733104c1 WatchSource:0}: Error finding container 9f72946f5a2f4b4d6777b04a76547277032c6441f3eb021729198971733104c1: Status 404 returned error can't find the container with id 9f72946f5a2f4b4d6777b04a76547277032c6441f3eb021729198971733104c1 Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.770228 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d7x2r"] Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.800297 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-td5nc"] Mar 18 13:26:00 crc kubenswrapper[4912]: I0318 13:26:00.866222 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.152618 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gp7xf"] Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.192565 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tlb72"] Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.219016 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:26:01 crc kubenswrapper[4912]: W0318 13:26:01.272234 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4cdd8f3_f85b_4a98_a164_bc9462b4932f.slice/crio-503b6c4271a275d0f1796fe1dd709664811aecd6070c2967aab36869c2deb5e9 WatchSource:0}: Error finding container 503b6c4271a275d0f1796fe1dd709664811aecd6070c2967aab36869c2deb5e9: Status 404 returned error can't find the container with id 503b6c4271a275d0f1796fe1dd709664811aecd6070c2967aab36869c2deb5e9 Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.448529 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.548692 4912 generic.go:334] "Generic (PLEG): container finished" podID="068dd8a7-2c6a-4da2-abe3-37df375434a5" containerID="5453a99390c2c9fcf65212ee209aa3257ad0d9f368cf36b7cc8c7a564157580f" exitCode=0 Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.549170 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-9hmv5" event={"ID":"068dd8a7-2c6a-4da2-abe3-37df375434a5","Type":"ContainerDied","Data":"5453a99390c2c9fcf65212ee209aa3257ad0d9f368cf36b7cc8c7a564157580f"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.568282 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c","Type":"ContainerStarted","Data":"cc4d4b96a8d253f094a532d99734e82074ec72609ce62778cb6ee66c80ec75ec"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.588476 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f288c82f-0efc-4acd-bd6f-e60b60b7030e","Type":"ContainerStarted","Data":"1a3da77c4a00a1566f180c11c497bde7fe9cd7ba7e0f9fe9aceb9742693f8415"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.605495 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6zf76" event={"ID":"39c9cb6b-7493-434e-b069-61ce32dcdc95","Type":"ContainerStarted","Data":"034f5fad2378d4be86d7c75fbbce5883543c7719a8084de0e56ed92b60e91656"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.615318 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d7x2r" event={"ID":"e7732fd2-b813-47e5-8f23-823a3037df09","Type":"ContainerStarted","Data":"5cb5e37974650a5a079286c131856bca16e42886df88e50d99080ae00dd0874e"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.651032 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-6zf76" podStartSLOduration=4.6510076399999996 podStartE2EDuration="4.65100764s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:01.643473607 +0000 UTC m=+1410.102901032" watchObservedRunningTime="2026-03-18 13:26:01.65100764 +0000 UTC m=+1410.110435065" Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.653143 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.663808 4912 generic.go:334] "Generic (PLEG): container finished" podID="1f5e6082-788c-4556-b53c-d4b62b9649db" 
containerID="16f206f2f71d3bb0357cffdf34f9eb6ee8af0cd1950291fe1f41c9aa9f020eac" exitCode=0 Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.663892 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" event={"ID":"1f5e6082-788c-4556-b53c-d4b62b9649db","Type":"ContainerDied","Data":"16f206f2f71d3bb0357cffdf34f9eb6ee8af0cd1950291fe1f41c9aa9f020eac"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.663930 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" event={"ID":"1f5e6082-788c-4556-b53c-d4b62b9649db","Type":"ContainerStarted","Data":"9f72946f5a2f4b4d6777b04a76547277032c6441f3eb021729198971733104c1"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.675948 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" event={"ID":"dadc5395-e931-4293-b037-929db9a9bd99","Type":"ContainerStarted","Data":"354b66a06e01eac2e53c80654ea772516def04f904e368cfa9c6dc40a63c9dc6"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.745313 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a406878a-6e90-4c47-8e23-875349b55b1d","Type":"ContainerStarted","Data":"b6366b7ca211b3428c1a82546d88fb2bf5e489c6caf3d25c771d3a4433ae3325"} Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.765291 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.800035 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-nt9g8"] Mar 18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.811447 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bm267" event={"ID":"7066bcdd-2c08-451c-9985-a103bbd007ab","Type":"ContainerStarted","Data":"454531e67e48c62d206f8691ac34b2f156047af5873d1a3c1c66859c808ee5e9"} Mar 
18 13:26:01 crc kubenswrapper[4912]: I0318 13:26:01.872904 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tlb72" event={"ID":"a4cdd8f3-f85b-4a98-a164-bc9462b4932f","Type":"ContainerStarted","Data":"503b6c4271a275d0f1796fe1dd709664811aecd6070c2967aab36869c2deb5e9"} Mar 18 13:26:01 crc kubenswrapper[4912]: W0318 13:26:01.889792 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod366d4bcf_ceeb_48e7_a834_c579166036b6.slice/crio-a9ecdd20d9420de1749bfaef3a44076f431ed829739c810c3fded0750293349b WatchSource:0}: Error finding container a9ecdd20d9420de1749bfaef3a44076f431ed829739c810c3fded0750293349b: Status 404 returned error can't find the container with id a9ecdd20d9420de1749bfaef3a44076f431ed829739c810c3fded0750293349b Mar 18 13:26:02 crc kubenswrapper[4912]: I0318 13:26:02.333442 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:26:02 crc kubenswrapper[4912]: I0318 13:26:02.914485 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:26:02 crc kubenswrapper[4912]: I0318 13:26:02.961504 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-9hmv5" Mar 18 13:26:02 crc kubenswrapper[4912]: I0318 13:26:02.961603 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-9hmv5" event={"ID":"068dd8a7-2c6a-4da2-abe3-37df375434a5","Type":"ContainerDied","Data":"37c7e694dec22f37f9aae59f46baf624cdc953b5ca03d31cbdefa1fe7e811969"} Mar 18 13:26:02 crc kubenswrapper[4912]: I0318 13:26:02.961707 4912 scope.go:117] "RemoveContainer" containerID="5453a99390c2c9fcf65212ee209aa3257ad0d9f368cf36b7cc8c7a564157580f" Mar 18 13:26:02 crc kubenswrapper[4912]: I0318 13:26:02.963405 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:26:02 crc kubenswrapper[4912]: I0318 13:26:02.966274 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" event={"ID":"366d4bcf-ceeb-48e7-a834-c579166036b6","Type":"ContainerStarted","Data":"a9ecdd20d9420de1749bfaef3a44076f431ed829739c810c3fded0750293349b"} Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.008317 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83866d6-0538-4802-868e-eac2411d81db","Type":"ContainerStarted","Data":"3cbe4bcd0742839b3422ce810c1f0d8c11a782a7649bd5cdcd347ba538348b4b"} Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.060635 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.060720 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-td5nc" event={"ID":"1f5e6082-788c-4556-b53c-d4b62b9649db","Type":"ContainerDied","Data":"9f72946f5a2f4b4d6777b04a76547277032c6441f3eb021729198971733104c1"} Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072539 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-svc\") pod \"068dd8a7-2c6a-4da2-abe3-37df375434a5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072616 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-swift-storage-0\") pod \"068dd8a7-2c6a-4da2-abe3-37df375434a5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072658 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6q6k\" (UniqueName: \"kubernetes.io/projected/068dd8a7-2c6a-4da2-abe3-37df375434a5-kube-api-access-n6q6k\") pod \"068dd8a7-2c6a-4da2-abe3-37df375434a5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072695 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-sb\") pod \"068dd8a7-2c6a-4da2-abe3-37df375434a5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072719 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-nb\") pod \"1f5e6082-788c-4556-b53c-d4b62b9649db\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072743 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-config\") pod \"1f5e6082-788c-4556-b53c-d4b62b9649db\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072815 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-swift-storage-0\") pod \"1f5e6082-788c-4556-b53c-d4b62b9649db\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.072948 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-sb\") pod 
\"1f5e6082-788c-4556-b53c-d4b62b9649db\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.073096 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfsxz\" (UniqueName: \"kubernetes.io/projected/1f5e6082-788c-4556-b53c-d4b62b9649db-kube-api-access-mfsxz\") pod \"1f5e6082-788c-4556-b53c-d4b62b9649db\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.073143 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-config\") pod \"068dd8a7-2c6a-4da2-abe3-37df375434a5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.073241 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-svc\") pod \"1f5e6082-788c-4556-b53c-d4b62b9649db\" (UID: \"1f5e6082-788c-4556-b53c-d4b62b9649db\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.073268 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-nb\") pod \"068dd8a7-2c6a-4da2-abe3-37df375434a5\" (UID: \"068dd8a7-2c6a-4da2-abe3-37df375434a5\") " Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.099933 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f5e6082-788c-4556-b53c-d4b62b9649db-kube-api-access-mfsxz" (OuterVolumeSpecName: "kube-api-access-mfsxz") pod "1f5e6082-788c-4556-b53c-d4b62b9649db" (UID: "1f5e6082-788c-4556-b53c-d4b62b9649db"). InnerVolumeSpecName "kube-api-access-mfsxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.131432 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068dd8a7-2c6a-4da2-abe3-37df375434a5-kube-api-access-n6q6k" (OuterVolumeSpecName: "kube-api-access-n6q6k") pod "068dd8a7-2c6a-4da2-abe3-37df375434a5" (UID: "068dd8a7-2c6a-4da2-abe3-37df375434a5"). InnerVolumeSpecName "kube-api-access-n6q6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.192968 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfsxz\" (UniqueName: \"kubernetes.io/projected/1f5e6082-788c-4556-b53c-d4b62b9649db-kube-api-access-mfsxz\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.200990 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6q6k\" (UniqueName: \"kubernetes.io/projected/068dd8a7-2c6a-4da2-abe3-37df375434a5-kube-api-access-n6q6k\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.226304 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.227634 4912 scope.go:117] "RemoveContainer" containerID="16f206f2f71d3bb0357cffdf34f9eb6ee8af0cd1950291fe1f41c9aa9f020eac" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.305472 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f5e6082-788c-4556-b53c-d4b62b9649db" (UID: "1f5e6082-788c-4556-b53c-d4b62b9649db"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.409398 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.708263 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f5e6082-788c-4556-b53c-d4b62b9649db" (UID: "1f5e6082-788c-4556-b53c-d4b62b9649db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.729917 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.785786 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "068dd8a7-2c6a-4da2-abe3-37df375434a5" (UID: "068dd8a7-2c6a-4da2-abe3-37df375434a5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.833125 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.859172 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-config" (OuterVolumeSpecName: "config") pod "068dd8a7-2c6a-4da2-abe3-37df375434a5" (UID: "068dd8a7-2c6a-4da2-abe3-37df375434a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.941559 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:03 crc kubenswrapper[4912]: I0318 13:26:03.947715 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "068dd8a7-2c6a-4da2-abe3-37df375434a5" (UID: "068dd8a7-2c6a-4da2-abe3-37df375434a5"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.057358 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.170928 4912 generic.go:334] "Generic (PLEG): container finished" podID="dadc5395-e931-4293-b037-929db9a9bd99" containerID="2f277b3c369b730545258bccc92a22af36f8ec2036848a4f82c2603740b767ac" exitCode=0 Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.171168 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" event={"ID":"dadc5395-e931-4293-b037-929db9a9bd99","Type":"ContainerDied","Data":"2f277b3c369b730545258bccc92a22af36f8ec2036848a4f82c2603740b767ac"} Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.281743 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-config" (OuterVolumeSpecName: "config") pod "1f5e6082-788c-4556-b53c-d4b62b9649db" (UID: "1f5e6082-788c-4556-b53c-d4b62b9649db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.301817 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f5e6082-788c-4556-b53c-d4b62b9649db" (UID: "1f5e6082-788c-4556-b53c-d4b62b9649db"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.307156 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.307229 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.312557 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "068dd8a7-2c6a-4da2-abe3-37df375434a5" (UID: "068dd8a7-2c6a-4da2-abe3-37df375434a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.348899 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "068dd8a7-2c6a-4da2-abe3-37df375434a5" (UID: "068dd8a7-2c6a-4da2-abe3-37df375434a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.410248 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f5e6082-788c-4556-b53c-d4b62b9649db" (UID: "1f5e6082-788c-4556-b53c-d4b62b9649db"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.410715 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.410742 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/068dd8a7-2c6a-4da2-abe3-37df375434a5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:04 crc kubenswrapper[4912]: I0318 13:26:04.410758 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5e6082-788c-4556-b53c-d4b62b9649db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:05 crc kubenswrapper[4912]: I0318 13:26:05.095227 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-9hmv5"] Mar 18 13:26:05 crc kubenswrapper[4912]: I0318 13:26:05.108452 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-9hmv5"] Mar 18 13:26:05 crc kubenswrapper[4912]: I0318 13:26:05.198007 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-td5nc"] Mar 18 13:26:05 crc kubenswrapper[4912]: I0318 13:26:05.266849 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-td5nc"] Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.296171 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068dd8a7-2c6a-4da2-abe3-37df375434a5" path="/var/lib/kubelet/pods/068dd8a7-2c6a-4da2-abe3-37df375434a5/volumes" Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.297591 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f5e6082-788c-4556-b53c-d4b62b9649db" path="/var/lib/kubelet/pods/1f5e6082-788c-4556-b53c-d4b62b9649db/volumes" 
Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.300746 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" event={"ID":"366d4bcf-ceeb-48e7-a834-c579166036b6","Type":"ContainerStarted","Data":"8cb27bbe63cf586b29efbafbd56bc38226c8ccf81ec48f1442ed99ae56a48117"} Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.306972 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83866d6-0538-4802-868e-eac2411d81db","Type":"ContainerStarted","Data":"9929e970c931615e6a3a5dca44604af246228f805bae09c23f276b6722cbfc7e"} Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.326460 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" podStartSLOduration=4.868461787 podStartE2EDuration="6.326438449s" podCreationTimestamp="2026-03-18 13:26:00 +0000 UTC" firstStartedPulling="2026-03-18 13:26:01.899287833 +0000 UTC m=+1410.358715258" lastFinishedPulling="2026-03-18 13:26:03.357264495 +0000 UTC m=+1411.816691920" observedRunningTime="2026-03-18 13:26:06.319512793 +0000 UTC m=+1414.778940218" watchObservedRunningTime="2026-03-18 13:26:06.326438449 +0000 UTC m=+1414.785865874" Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.346705 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f288c82f-0efc-4acd-bd6f-e60b60b7030e","Type":"ContainerStarted","Data":"3bd15fd32f535102af3f4ec1760f5ee401bf9686a976b0cde70be7949ddd1076"} Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.353241 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" event={"ID":"dadc5395-e931-4293-b037-929db9a9bd99","Type":"ContainerStarted","Data":"0a64d87e0d6fe16eaab283a6cba6329161f876089c59b7d252b8ad343c70a8de"} Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.354777 4912 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.383537 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a406878a-6e90-4c47-8e23-875349b55b1d","Type":"ContainerStarted","Data":"41f9612b95cefb9baf2c78de879ad4521cced4fd943888f4542f8d69d60a534a"} Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.383608 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a406878a-6e90-4c47-8e23-875349b55b1d","Type":"ContainerStarted","Data":"73471b2a92373e6be11ddf99cffe74f537d1ca469644df3892f350ac4fc68e6d"} Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.396119 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" podStartSLOduration=9.396093197999999 podStartE2EDuration="9.396093198s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:06.38200739 +0000 UTC m=+1414.841434855" watchObservedRunningTime="2026-03-18 13:26:06.396093198 +0000 UTC m=+1414.855520623" Mar 18 13:26:06 crc kubenswrapper[4912]: I0318 13:26:06.448678 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.448647849 podStartE2EDuration="18.448647849s" podCreationTimestamp="2026-03-18 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:06.431891099 +0000 UTC m=+1414.891318554" watchObservedRunningTime="2026-03-18 13:26:06.448647849 +0000 UTC m=+1414.908075294" Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.421804 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="366d4bcf-ceeb-48e7-a834-c579166036b6" containerID="8cb27bbe63cf586b29efbafbd56bc38226c8ccf81ec48f1442ed99ae56a48117" exitCode=0 Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.422023 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" event={"ID":"366d4bcf-ceeb-48e7-a834-c579166036b6","Type":"ContainerDied","Data":"8cb27bbe63cf586b29efbafbd56bc38226c8ccf81ec48f1442ed99ae56a48117"} Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.440354 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83866d6-0538-4802-868e-eac2411d81db","Type":"ContainerStarted","Data":"f0a7c6d9304dff20cbac7ec7158f523000e966f06c9c7acfa552af6b238c13a2"} Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.440569 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-log" containerID="cri-o://9929e970c931615e6a3a5dca44604af246228f805bae09c23f276b6722cbfc7e" gracePeriod=30 Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.440812 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-httpd" containerID="cri-o://f0a7c6d9304dff20cbac7ec7158f523000e966f06c9c7acfa552af6b238c13a2" gracePeriod=30 Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.457948 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f288c82f-0efc-4acd-bd6f-e60b60b7030e","Type":"ContainerStarted","Data":"b035e1779cc30685e81c9d0362d700fb2885fddf56c6b78b45272a4a752bb390"} Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.459770 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-log" containerID="cri-o://3bd15fd32f535102af3f4ec1760f5ee401bf9686a976b0cde70be7949ddd1076" gracePeriod=30 Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.460322 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-httpd" containerID="cri-o://b035e1779cc30685e81c9d0362d700fb2885fddf56c6b78b45272a4a752bb390" gracePeriod=30 Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.524447 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.524414573 podStartE2EDuration="10.524414573s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:07.473588259 +0000 UTC m=+1415.933015684" watchObservedRunningTime="2026-03-18 13:26:07.524414573 +0000 UTC m=+1415.983841998" Mar 18 13:26:07 crc kubenswrapper[4912]: I0318 13:26:07.527660 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.527644 podStartE2EDuration="10.527644s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:07.501762545 +0000 UTC m=+1415.961189990" watchObservedRunningTime="2026-03-18 13:26:07.527644 +0000 UTC m=+1415.987071435" Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.505266 4912 generic.go:334] "Generic (PLEG): container finished" podID="e83866d6-0538-4802-868e-eac2411d81db" containerID="f0a7c6d9304dff20cbac7ec7158f523000e966f06c9c7acfa552af6b238c13a2" exitCode=0 Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.505749 4912 
generic.go:334] "Generic (PLEG): container finished" podID="e83866d6-0538-4802-868e-eac2411d81db" containerID="9929e970c931615e6a3a5dca44604af246228f805bae09c23f276b6722cbfc7e" exitCode=143 Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.505825 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83866d6-0538-4802-868e-eac2411d81db","Type":"ContainerDied","Data":"f0a7c6d9304dff20cbac7ec7158f523000e966f06c9c7acfa552af6b238c13a2"} Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.505871 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83866d6-0538-4802-868e-eac2411d81db","Type":"ContainerDied","Data":"9929e970c931615e6a3a5dca44604af246228f805bae09c23f276b6722cbfc7e"} Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.531373 4912 generic.go:334] "Generic (PLEG): container finished" podID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerID="b035e1779cc30685e81c9d0362d700fb2885fddf56c6b78b45272a4a752bb390" exitCode=0 Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.532433 4912 generic.go:334] "Generic (PLEG): container finished" podID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerID="3bd15fd32f535102af3f4ec1760f5ee401bf9686a976b0cde70be7949ddd1076" exitCode=143 Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.531450 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f288c82f-0efc-4acd-bd6f-e60b60b7030e","Type":"ContainerDied","Data":"b035e1779cc30685e81c9d0362d700fb2885fddf56c6b78b45272a4a752bb390"} Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.532664 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f288c82f-0efc-4acd-bd6f-e60b60b7030e","Type":"ContainerDied","Data":"3bd15fd32f535102af3f4ec1760f5ee401bf9686a976b0cde70be7949ddd1076"} Mar 18 13:26:08 crc kubenswrapper[4912]: 
I0318 13:26:08.537546 4912 generic.go:334] "Generic (PLEG): container finished" podID="7066bcdd-2c08-451c-9985-a103bbd007ab" containerID="454531e67e48c62d206f8691ac34b2f156047af5873d1a3c1c66859c808ee5e9" exitCode=0 Mar 18 13:26:08 crc kubenswrapper[4912]: I0318 13:26:08.539005 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bm267" event={"ID":"7066bcdd-2c08-451c-9985-a103bbd007ab","Type":"ContainerDied","Data":"454531e67e48c62d206f8691ac34b2f156047af5873d1a3c1c66859c808ee5e9"} Mar 18 13:26:09 crc kubenswrapper[4912]: I0318 13:26:09.550095 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.816628 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.821975 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bm267" Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.942165 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn8mf\" (UniqueName: \"kubernetes.io/projected/366d4bcf-ceeb-48e7-a834-c579166036b6-kube-api-access-xn8mf\") pod \"366d4bcf-ceeb-48e7-a834-c579166036b6\" (UID: \"366d4bcf-ceeb-48e7-a834-c579166036b6\") " Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.942463 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-config-data\") pod \"7066bcdd-2c08-451c-9985-a103bbd007ab\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.942510 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-combined-ca-bundle\") pod \"7066bcdd-2c08-451c-9985-a103bbd007ab\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.942640 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-fernet-keys\") pod \"7066bcdd-2c08-451c-9985-a103bbd007ab\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.942780 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftzjc\" (UniqueName: \"kubernetes.io/projected/7066bcdd-2c08-451c-9985-a103bbd007ab-kube-api-access-ftzjc\") pod \"7066bcdd-2c08-451c-9985-a103bbd007ab\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.942889 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-credential-keys\") pod \"7066bcdd-2c08-451c-9985-a103bbd007ab\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " Mar 18 13:26:11 crc kubenswrapper[4912]: I0318 13:26:11.942921 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-scripts\") pod \"7066bcdd-2c08-451c-9985-a103bbd007ab\" (UID: \"7066bcdd-2c08-451c-9985-a103bbd007ab\") " Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.003298 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7066bcdd-2c08-451c-9985-a103bbd007ab" (UID: "7066bcdd-2c08-451c-9985-a103bbd007ab"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.026160 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-scripts" (OuterVolumeSpecName: "scripts") pod "7066bcdd-2c08-451c-9985-a103bbd007ab" (UID: "7066bcdd-2c08-451c-9985-a103bbd007ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.026526 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7066bcdd-2c08-451c-9985-a103bbd007ab-kube-api-access-ftzjc" (OuterVolumeSpecName: "kube-api-access-ftzjc") pod "7066bcdd-2c08-451c-9985-a103bbd007ab" (UID: "7066bcdd-2c08-451c-9985-a103bbd007ab"). InnerVolumeSpecName "kube-api-access-ftzjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.035739 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366d4bcf-ceeb-48e7-a834-c579166036b6-kube-api-access-xn8mf" (OuterVolumeSpecName: "kube-api-access-xn8mf") pod "366d4bcf-ceeb-48e7-a834-c579166036b6" (UID: "366d4bcf-ceeb-48e7-a834-c579166036b6"). InnerVolumeSpecName "kube-api-access-xn8mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.044277 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7066bcdd-2c08-451c-9985-a103bbd007ab" (UID: "7066bcdd-2c08-451c-9985-a103bbd007ab"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.072901 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftzjc\" (UniqueName: \"kubernetes.io/projected/7066bcdd-2c08-451c-9985-a103bbd007ab-kube-api-access-ftzjc\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.072944 4912 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.072965 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.072976 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn8mf\" (UniqueName: 
\"kubernetes.io/projected/366d4bcf-ceeb-48e7-a834-c579166036b6-kube-api-access-xn8mf\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.072987 4912 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.088096 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7066bcdd-2c08-451c-9985-a103bbd007ab" (UID: "7066bcdd-2c08-451c-9985-a103bbd007ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.106015 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-config-data" (OuterVolumeSpecName: "config-data") pod "7066bcdd-2c08-451c-9985-a103bbd007ab" (UID: "7066bcdd-2c08-451c-9985-a103bbd007ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.175825 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.175873 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066bcdd-2c08-451c-9985-a103bbd007ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.627145 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bm267" event={"ID":"7066bcdd-2c08-451c-9985-a103bbd007ab","Type":"ContainerDied","Data":"6ff1a6c726cc8893c8c1c04d8e5688056a877d6c9eb3d67704e4217fd833a240"} Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.627211 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff1a6c726cc8893c8c1c04d8e5688056a877d6c9eb3d67704e4217fd833a240" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.627322 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bm267" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.634399 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" event={"ID":"366d4bcf-ceeb-48e7-a834-c579166036b6","Type":"ContainerDied","Data":"a9ecdd20d9420de1749bfaef3a44076f431ed829739c810c3fded0750293349b"} Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.634682 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9ecdd20d9420de1749bfaef3a44076f431ed829739c810c3fded0750293349b" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.634796 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-nt9g8" Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.935712 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-sn5lp"] Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.950067 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-sn5lp"] Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.966697 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bm267"] Mar 18 13:26:12 crc kubenswrapper[4912]: I0318 13:26:12.978715 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bm267"] Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.060384 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n5nn7"] Mar 18 13:26:13 crc kubenswrapper[4912]: E0318 13:26:13.061058 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366d4bcf-ceeb-48e7-a834-c579166036b6" containerName="oc" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061072 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="366d4bcf-ceeb-48e7-a834-c579166036b6" containerName="oc" Mar 18 13:26:13 crc kubenswrapper[4912]: E0318 13:26:13.061100 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068dd8a7-2c6a-4da2-abe3-37df375434a5" containerName="init" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061107 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="068dd8a7-2c6a-4da2-abe3-37df375434a5" containerName="init" Mar 18 13:26:13 crc kubenswrapper[4912]: E0318 13:26:13.061132 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7066bcdd-2c08-451c-9985-a103bbd007ab" containerName="keystone-bootstrap" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061139 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7066bcdd-2c08-451c-9985-a103bbd007ab" containerName="keystone-bootstrap" Mar 18 13:26:13 crc kubenswrapper[4912]: E0318 13:26:13.061155 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5e6082-788c-4556-b53c-d4b62b9649db" containerName="init" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061162 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5e6082-788c-4556-b53c-d4b62b9649db" containerName="init" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061383 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="366d4bcf-ceeb-48e7-a834-c579166036b6" containerName="oc" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061402 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f5e6082-788c-4556-b53c-d4b62b9649db" containerName="init" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061413 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="068dd8a7-2c6a-4da2-abe3-37df375434a5" containerName="init" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.061429 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7066bcdd-2c08-451c-9985-a103bbd007ab" containerName="keystone-bootstrap" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.062408 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.065084 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.065389 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.065389 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.065535 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knn24" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.069929 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.091292 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n5nn7"] Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.211208 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-config-data\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.211265 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7gz\" (UniqueName: \"kubernetes.io/projected/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-kube-api-access-qq7gz\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.211321 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-fernet-keys\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.211367 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-combined-ca-bundle\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.211406 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-credential-keys\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.211510 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-scripts\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.314623 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-credential-keys\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.314758 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-scripts\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.314884 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-config-data\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.314909 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7gz\" (UniqueName: \"kubernetes.io/projected/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-kube-api-access-qq7gz\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.314949 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-fernet-keys\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.314992 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-combined-ca-bundle\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.321946 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-scripts\") pod \"keystone-bootstrap-n5nn7\" (UID: 
\"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.322801 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-config-data\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.324230 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-credential-keys\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.333933 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-combined-ca-bundle\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.438119 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-fernet-keys\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.439359 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7gz\" (UniqueName: \"kubernetes.io/projected/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-kube-api-access-qq7gz\") pod \"keystone-bootstrap-n5nn7\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 
13:26:13.683780 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:13 crc kubenswrapper[4912]: I0318 13:26:13.983868 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.113901 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-s6l7z"] Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.114319 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" containerID="cri-o://fe4129bd358e1716fa75b4bc44fa8f135256f852b8387ebeb8d3fbc096d30410" gracePeriod=10 Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.252991 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7066bcdd-2c08-451c-9985-a103bbd007ab" path="/var/lib/kubelet/pods/7066bcdd-2c08-451c-9985-a103bbd007ab/volumes" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.253571 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59fbd6e-ada8-4bbc-bfcc-e80a464664f9" path="/var/lib/kubelet/pods/e59fbd6e-ada8-4bbc-bfcc-e80a464664f9/volumes" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.388378 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.398674 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.471122 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmvlp\" (UniqueName: \"kubernetes.io/projected/e83866d6-0538-4802-868e-eac2411d81db-kube-api-access-lmvlp\") pod \"e83866d6-0538-4802-868e-eac2411d81db\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.471586 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-combined-ca-bundle\") pod \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.471660 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-config-data\") pod \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.471746 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-logs\") pod \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.471816 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-config-data\") pod \"e83866d6-0538-4802-868e-eac2411d81db\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.474723 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-logs" (OuterVolumeSpecName: "logs") pod "f288c82f-0efc-4acd-bd6f-e60b60b7030e" (UID: "f288c82f-0efc-4acd-bd6f-e60b60b7030e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.475375 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdmrn\" (UniqueName: \"kubernetes.io/projected/f288c82f-0efc-4acd-bd6f-e60b60b7030e-kube-api-access-qdmrn\") pod \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.475410 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-logs\") pod \"e83866d6-0538-4802-868e-eac2411d81db\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.475681 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-combined-ca-bundle\") pod \"e83866d6-0538-4802-868e-eac2411d81db\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.475865 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-httpd-run\") pod \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.475946 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-httpd-run\") pod \"e83866d6-0538-4802-868e-eac2411d81db\" (UID: 
\"e83866d6-0538-4802-868e-eac2411d81db\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.476193 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"e83866d6-0538-4802-868e-eac2411d81db\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.476293 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-scripts\") pod \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.476405 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-scripts\") pod \"e83866d6-0538-4802-868e-eac2411d81db\" (UID: \"e83866d6-0538-4802-868e-eac2411d81db\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.476568 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\" (UID: \"f288c82f-0efc-4acd-bd6f-e60b60b7030e\") " Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.478992 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f288c82f-0efc-4acd-bd6f-e60b60b7030e" (UID: "f288c82f-0efc-4acd-bd6f-e60b60b7030e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.479113 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e83866d6-0538-4802-868e-eac2411d81db" (UID: "e83866d6-0538-4802-868e-eac2411d81db"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.480585 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83866d6-0538-4802-868e-eac2411d81db-kube-api-access-lmvlp" (OuterVolumeSpecName: "kube-api-access-lmvlp") pod "e83866d6-0538-4802-868e-eac2411d81db" (UID: "e83866d6-0538-4802-868e-eac2411d81db"). InnerVolumeSpecName "kube-api-access-lmvlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.480923 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-logs" (OuterVolumeSpecName: "logs") pod "e83866d6-0538-4802-868e-eac2411d81db" (UID: "e83866d6-0538-4802-868e-eac2411d81db"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.481478 4912 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.483134 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmvlp\" (UniqueName: \"kubernetes.io/projected/e83866d6-0538-4802-868e-eac2411d81db-kube-api-access-lmvlp\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.483236 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-logs\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.483307 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83866d6-0538-4802-868e-eac2411d81db-logs\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.483418 4912 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f288c82f-0efc-4acd-bd6f-e60b60b7030e-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.489412 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-scripts" (OuterVolumeSpecName: "scripts") pod "f288c82f-0efc-4acd-bd6f-e60b60b7030e" (UID: "f288c82f-0efc-4acd-bd6f-e60b60b7030e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.497423 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-scripts" (OuterVolumeSpecName: "scripts") pod "e83866d6-0538-4802-868e-eac2411d81db" (UID: "e83866d6-0538-4802-868e-eac2411d81db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.505469 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f288c82f-0efc-4acd-bd6f-e60b60b7030e-kube-api-access-qdmrn" (OuterVolumeSpecName: "kube-api-access-qdmrn") pod "f288c82f-0efc-4acd-bd6f-e60b60b7030e" (UID: "f288c82f-0efc-4acd-bd6f-e60b60b7030e"). InnerVolumeSpecName "kube-api-access-qdmrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.513693 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702" (OuterVolumeSpecName: "glance") pod "e83866d6-0538-4802-868e-eac2411d81db" (UID: "e83866d6-0538-4802-868e-eac2411d81db"). InnerVolumeSpecName "pvc-7a77a725-7984-4a1f-952c-59e02a3e3702". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.527801 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f288c82f-0efc-4acd-bd6f-e60b60b7030e" (UID: "f288c82f-0efc-4acd-bd6f-e60b60b7030e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.549745 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e83866d6-0538-4802-868e-eac2411d81db" (UID: "e83866d6-0538-4802-868e-eac2411d81db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.549951 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a" (OuterVolumeSpecName: "glance") pod "f288c82f-0efc-4acd-bd6f-e60b60b7030e" (UID: "f288c82f-0efc-4acd-bd6f-e60b60b7030e"). InnerVolumeSpecName "pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.580632 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-config-data" (OuterVolumeSpecName: "config-data") pod "f288c82f-0efc-4acd-bd6f-e60b60b7030e" (UID: "f288c82f-0efc-4acd-bd6f-e60b60b7030e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.583545 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-config-data" (OuterVolumeSpecName: "config-data") pod "e83866d6-0538-4802-868e-eac2411d81db" (UID: "e83866d6-0538-4802-868e-eac2411d81db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591374 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591424 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdmrn\" (UniqueName: \"kubernetes.io/projected/f288c82f-0efc-4acd-bd6f-e60b60b7030e-kube-api-access-qdmrn\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591442 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591501 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") on node \"crc\" "
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591520 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591535 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83866d6-0538-4802-868e-eac2411d81db-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591561 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") on node \"crc\" "
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591576 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.591591 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f288c82f-0efc-4acd-bd6f-e60b60b7030e-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.628082 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.628398 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a") on node "crc"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.628604 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.628729 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7a77a725-7984-4a1f-952c-59e02a3e3702" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702") on node "crc"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.694494 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.694527 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") on node \"crc\" DevicePath \"\""
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.701862 4912 generic.go:334] "Generic (PLEG): container finished" podID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerID="fe4129bd358e1716fa75b4bc44fa8f135256f852b8387ebeb8d3fbc096d30410" exitCode=0
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.701935 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" event={"ID":"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452","Type":"ContainerDied","Data":"fe4129bd358e1716fa75b4bc44fa8f135256f852b8387ebeb8d3fbc096d30410"}
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.704700 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83866d6-0538-4802-868e-eac2411d81db","Type":"ContainerDied","Data":"3cbe4bcd0742839b3422ce810c1f0d8c11a782a7649bd5cdcd347ba538348b4b"}
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.704746 4912 scope.go:117] "RemoveContainer" containerID="f0a7c6d9304dff20cbac7ec7158f523000e966f06c9c7acfa552af6b238c13a2"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.704897 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.721761 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f288c82f-0efc-4acd-bd6f-e60b60b7030e","Type":"ContainerDied","Data":"1a3da77c4a00a1566f180c11c497bde7fe9cd7ba7e0f9fe9aceb9742693f8415"}
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.721862 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.771910 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.794261 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.813456 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.835314 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.854850 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 13:26:14 crc kubenswrapper[4912]: E0318 13:26:14.855667 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-httpd"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.855695 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-httpd"
Mar 18 13:26:14 crc kubenswrapper[4912]: E0318 13:26:14.855713 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-httpd"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.855722 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-httpd"
Mar 18 13:26:14 crc kubenswrapper[4912]: E0318 13:26:14.855739 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-log"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.855748 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-log"
Mar 18 13:26:14 crc kubenswrapper[4912]: E0318 13:26:14.855825 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-log"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.855835 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-log"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.856268 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-log"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.856296 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-log"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.856306 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83866d6-0538-4802-868e-eac2411d81db" containerName="glance-httpd"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.856324 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" containerName="glance-httpd"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.858357 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.861088 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lm6d9"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.866554 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.866958 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.867111 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.896154 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.909223 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.912149 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.915572 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.915943 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 13:26:14 crc kubenswrapper[4912]: I0318 13:26:14.922301 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002559 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002651 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vnx8\" (UniqueName: \"kubernetes.io/projected/9a01f4d6-98d9-49df-b056-436f333909bb-kube-api-access-4vnx8\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002673 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002701 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002738 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002887 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002941 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-logs\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002965 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.002987 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.003060 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rdk\" (UniqueName: \"kubernetes.io/projected/2b5cde83-581f-4589-b094-2248eb7430d3-kube-api-access-76rdk\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.003245 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.003472 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.003518 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.003626 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.003661 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.003743 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.105922 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vnx8\" (UniqueName: \"kubernetes.io/projected/9a01f4d6-98d9-49df-b056-436f333909bb-kube-api-access-4vnx8\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106387 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106433 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106472 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106513 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106568 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-logs\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106611 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106661 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76rdk\" (UniqueName: \"kubernetes.io/projected/2b5cde83-581f-4589-b094-2248eb7430d3-kube-api-access-76rdk\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106714 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106783 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106809 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106854 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106878 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106916 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.106941 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.107867 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-logs\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.111828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.113696 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.115499 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.115891 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.116752 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.117685 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.119508 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.120544 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.120575 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a72a99aa2c60b032f6beea238cfa834e3b34d6f5ea6f566ee310bef28b7d80e1/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.120563 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.120757 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.121357 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.122315 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.122351 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26f2544761a6fbc0806c14fe58682cf74b4da7181bc2e0537e305660953f7255/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.123308 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.133648 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vnx8\" (UniqueName: \"kubernetes.io/projected/9a01f4d6-98d9-49df-b056-436f333909bb-kube-api-access-4vnx8\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.134586 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rdk\" (UniqueName: \"kubernetes.io/projected/2b5cde83-581f-4589-b094-2248eb7430d3-kube-api-access-76rdk\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.193485 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.194311 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.248328 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.495265 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 13:26:15 crc kubenswrapper[4912]: I0318 13:26:15.710919 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused"
Mar 18 13:26:16 crc kubenswrapper[4912]: I0318 13:26:16.243532 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83866d6-0538-4802-868e-eac2411d81db" path="/var/lib/kubelet/pods/e83866d6-0538-4802-868e-eac2411d81db/volumes"
Mar 18 13:26:16 crc kubenswrapper[4912]: I0318 13:26:16.244631 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f288c82f-0efc-4acd-bd6f-e60b60b7030e" path="/var/lib/kubelet/pods/f288c82f-0efc-4acd-bd6f-e60b60b7030e/volumes"
Mar 18 13:26:17 crc kubenswrapper[4912]: E0318 13:26:17.670719 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Mar 18 13:26:17 crc kubenswrapper[4912]: E0318 13:26:17.674806 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n686h7fh555h5d7h545h5dch58h5fh695h75h59h68h9dh5b7hd6h57ch8h8bh666h5b8h556h7ch5d4hd4hbfh97h665h687h5c5h5d4h688h574q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9l5r2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:26:19 crc kubenswrapper[4912]: I0318 13:26:19.556301 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 13:26:19 crc kubenswrapper[4912]: I0318 13:26:19.586333 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 13:26:19 crc kubenswrapper[4912]: I0318 13:26:19.794721 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 13:26:20 crc kubenswrapper[4912]: I0318 13:26:20.714345 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused" Mar 18 13:26:23 crc kubenswrapper[4912]: E0318 13:26:23.222934 4912 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 18 13:26:23 crc kubenswrapper[4912]: E0318 13:26:23.223519 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q9ts8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-j4dhx_openstack(539c384f-3502-4474-9c40-432909696dfb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:26:23 crc kubenswrapper[4912]: E0318 13:26:23.225341 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-j4dhx" podUID="539c384f-3502-4474-9c40-432909696dfb" Mar 18 13:26:23 crc kubenswrapper[4912]: E0318 13:26:23.853504 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-j4dhx" podUID="539c384f-3502-4474-9c40-432909696dfb" Mar 18 13:26:25 crc kubenswrapper[4912]: I0318 13:26:25.711836 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused" Mar 18 13:26:25 crc kubenswrapper[4912]: I0318 13:26:25.712588 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.166305 4912 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-rvmbk"] Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.171663 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.257301 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvmbk"] Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.294874 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-utilities\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.295081 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt66x\" (UniqueName: \"kubernetes.io/projected/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-kube-api-access-vt66x\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.295547 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-catalog-content\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.398159 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt66x\" (UniqueName: \"kubernetes.io/projected/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-kube-api-access-vt66x\") pod \"certified-operators-rvmbk\" (UID: 
\"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.398283 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-catalog-content\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.398474 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-utilities\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.400114 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-catalog-content\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.400833 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-utilities\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.450017 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt66x\" (UniqueName: \"kubernetes.io/projected/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-kube-api-access-vt66x\") pod \"certified-operators-rvmbk\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " 
pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:28 crc kubenswrapper[4912]: I0318 13:26:28.505487 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.529910 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rqm8k"] Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.533882 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.543702 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqm8k"] Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.669672 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-utilities\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.669743 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-catalog-content\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.670780 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8znv\" (UniqueName: \"kubernetes.io/projected/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-kube-api-access-c8znv\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " 
pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.711097 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.773026 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8znv\" (UniqueName: \"kubernetes.io/projected/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-kube-api-access-c8znv\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.773112 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-utilities\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.773141 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-catalog-content\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.773856 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-catalog-content\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.774466 
4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-utilities\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.801008 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8znv\" (UniqueName: \"kubernetes.io/projected/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-kube-api-access-c8znv\") pod \"redhat-operators-rqm8k\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.866966 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.939867 4912 generic.go:334] "Generic (PLEG): container finished" podID="39c9cb6b-7493-434e-b069-61ce32dcdc95" containerID="034f5fad2378d4be86d7c75fbbce5883543c7719a8084de0e56ed92b60e91656" exitCode=0 Mar 18 13:26:30 crc kubenswrapper[4912]: I0318 13:26:30.939923 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6zf76" event={"ID":"39c9cb6b-7493-434e-b069-61ce32dcdc95","Type":"ContainerDied","Data":"034f5fad2378d4be86d7c75fbbce5883543c7719a8084de0e56ed92b60e91656"} Mar 18 13:26:34 crc kubenswrapper[4912]: E0318 13:26:34.758711 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 18 13:26:34 crc kubenswrapper[4912]: E0318 13:26:34.759875 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c 
barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwvpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-d7x2r_openstack(e7732fd2-b813-47e5-8f23-823a3037df09): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:26:34 crc kubenswrapper[4912]: E0318 13:26:34.761241 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-d7x2r" 
podUID="e7732fd2-b813-47e5-8f23-823a3037df09" Mar 18 13:26:34 crc kubenswrapper[4912]: I0318 13:26:34.800925 4912 scope.go:117] "RemoveContainer" containerID="9929e970c931615e6a3a5dca44604af246228f805bae09c23f276b6722cbfc7e" Mar 18 13:26:34 crc kubenswrapper[4912]: I0318 13:26:34.910952 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-6zf76" Mar 18 13:26:34 crc kubenswrapper[4912]: I0318 13:26:34.998971 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rccx\" (UniqueName: \"kubernetes.io/projected/39c9cb6b-7493-434e-b069-61ce32dcdc95-kube-api-access-8rccx\") pod \"39c9cb6b-7493-434e-b069-61ce32dcdc95\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.000744 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-config\") pod \"39c9cb6b-7493-434e-b069-61ce32dcdc95\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.001239 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-combined-ca-bundle\") pod \"39c9cb6b-7493-434e-b069-61ce32dcdc95\" (UID: \"39c9cb6b-7493-434e-b069-61ce32dcdc95\") " Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.008087 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6zf76" event={"ID":"39c9cb6b-7493-434e-b069-61ce32dcdc95","Type":"ContainerDied","Data":"f9ebd39260084d1ab145c9a58f1754595890a014688022cead65666426eaa466"} Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.008142 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9ebd39260084d1ab145c9a58f1754595890a014688022cead65666426eaa466" Mar 18 
13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.008179 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-6zf76" Mar 18 13:26:35 crc kubenswrapper[4912]: E0318 13:26:35.015424 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-d7x2r" podUID="e7732fd2-b813-47e5-8f23-823a3037df09" Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.035815 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c9cb6b-7493-434e-b069-61ce32dcdc95-kube-api-access-8rccx" (OuterVolumeSpecName: "kube-api-access-8rccx") pod "39c9cb6b-7493-434e-b069-61ce32dcdc95" (UID: "39c9cb6b-7493-434e-b069-61ce32dcdc95"). InnerVolumeSpecName "kube-api-access-8rccx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.046452 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39c9cb6b-7493-434e-b069-61ce32dcdc95" (UID: "39c9cb6b-7493-434e-b069-61ce32dcdc95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.048150 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-config" (OuterVolumeSpecName: "config") pod "39c9cb6b-7493-434e-b069-61ce32dcdc95" (UID: "39c9cb6b-7493-434e-b069-61ce32dcdc95"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.105558 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.105606 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c9cb6b-7493-434e-b069-61ce32dcdc95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:35 crc kubenswrapper[4912]: I0318 13:26:35.105620 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rccx\" (UniqueName: \"kubernetes.io/projected/39c9cb6b-7493-434e-b069-61ce32dcdc95-kube-api-access-8rccx\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:35 crc kubenswrapper[4912]: E0318 13:26:35.993072 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 13:26:35 crc kubenswrapper[4912]: E0318 13:26:35.993353 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glpg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fbf2l_openstack(147c4d2b-19d3-48da-9364-c527a1cacc3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:26:35 crc kubenswrapper[4912]: E0318 13:26:35.995355 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fbf2l" podUID="147c4d2b-19d3-48da-9364-c527a1cacc3c" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.030379 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" event={"ID":"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452","Type":"ContainerDied","Data":"ddcb124e0d1b6d9117075252783c0755546a3d1d65275ec222072e4d11708200"} Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.030832 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddcb124e0d1b6d9117075252783c0755546a3d1d65275ec222072e4d11708200" Mar 18 13:26:36 crc kubenswrapper[4912]: E0318 13:26:36.032470 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-fbf2l" podUID="147c4d2b-19d3-48da-9364-c527a1cacc3c" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.216407 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.261113 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-nb\") pod \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.261212 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-swift-storage-0\") pod \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.261497 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-svc\") pod \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.261591 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-config\") pod \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.261681 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-sb\") pod \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.261714 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcwnw\" 
(UniqueName: \"kubernetes.io/projected/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-kube-api-access-hcwnw\") pod \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\" (UID: \"5307847c-fcc1-4e69-b2fc-3c0a7eb3c452\") " Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.278611 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-kube-api-access-hcwnw" (OuterVolumeSpecName: "kube-api-access-hcwnw") pod "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" (UID: "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452"). InnerVolumeSpecName "kube-api-access-hcwnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.365162 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcwnw\" (UniqueName: \"kubernetes.io/projected/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-kube-api-access-hcwnw\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.408172 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" (UID: "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.447339 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" (UID: "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.454733 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-config" (OuterVolumeSpecName: "config") pod "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" (UID: "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.469266 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.469316 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.469331 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.475898 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" (UID: "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.484964 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" (UID: "5307847c-fcc1-4e69-b2fc-3c0a7eb3c452"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.562681 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zqbcj"] Mar 18 13:26:36 crc kubenswrapper[4912]: E0318 13:26:36.563596 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c9cb6b-7493-434e-b069-61ce32dcdc95" containerName="neutron-db-sync" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.563620 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c9cb6b-7493-434e-b069-61ce32dcdc95" containerName="neutron-db-sync" Mar 18 13:26:36 crc kubenswrapper[4912]: E0318 13:26:36.563640 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="init" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.563649 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="init" Mar 18 13:26:36 crc kubenswrapper[4912]: E0318 13:26:36.563707 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.563716 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.564596 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c9cb6b-7493-434e-b069-61ce32dcdc95" 
containerName="neutron-db-sync" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.564658 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.566761 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zqbcj"] Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.566796 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bcdccd79d-bvwsd"] Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.567086 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.569054 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bcdccd79d-bvwsd"] Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.569184 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.572906 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.572945 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.578889 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.579325 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.579506 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.579668 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k5qfz" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.673272 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n5nn7"] Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675136 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-config\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675183 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-httpd-config\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675220 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-svc\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675243 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675301 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2g6\" (UniqueName: \"kubernetes.io/projected/66d0e89e-66f1-4da6-b974-4ab3c60b3520-kube-api-access-vr2g6\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675330 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-config\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675558 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jml2\" 
(UniqueName: \"kubernetes.io/projected/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-kube-api-access-6jml2\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.675674 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.676165 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.676296 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-combined-ca-bundle\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.676731 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-ovndb-tls-certs\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.779999 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.780231 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.780295 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-combined-ca-bundle\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.780407 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-ovndb-tls-certs\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781489 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781593 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-config\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781627 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-httpd-config\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781700 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-svc\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781737 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781755 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781807 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2g6\" (UniqueName: 
\"kubernetes.io/projected/66d0e89e-66f1-4da6-b974-4ab3c60b3520-kube-api-access-vr2g6\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781837 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-config\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.781876 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jml2\" (UniqueName: \"kubernetes.io/projected/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-kube-api-access-6jml2\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.782257 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-config\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.783552 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.783874 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-svc\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: 
\"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.786676 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-combined-ca-bundle\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.787293 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-httpd-config\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.791164 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-ovndb-tls-certs\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.792290 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-config\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.804259 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jml2\" (UniqueName: \"kubernetes.io/projected/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-kube-api-access-6jml2\") pod \"neutron-bcdccd79d-bvwsd\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.805392 
4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2g6\" (UniqueName: \"kubernetes.io/projected/66d0e89e-66f1-4da6-b974-4ab3c60b3520-kube-api-access-vr2g6\") pod \"dnsmasq-dns-6b7b667979-zqbcj\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.909142 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:36 crc kubenswrapper[4912]: E0318 13:26:36.912735 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified" Mar 18 13:26:36 crc kubenswrapper[4912]: E0318 13:26:36.913199 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n686h7fh555h5d7h545h5dch58h5fh695h75h59h68h9dh5b7hd6h57ch8h8bh666h5b8h556h7ch5d4hd4hbfh97h665h687h5c5h5d4h688h574q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9l5r2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.918783 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:36 crc kubenswrapper[4912]: I0318 13:26:36.971245 4912 scope.go:117] "RemoveContainer" containerID="b035e1779cc30685e81c9d0362d700fb2885fddf56c6b78b45272a4a752bb390" Mar 18 13:26:36 crc kubenswrapper[4912]: W0318 13:26:36.975636 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99d3f399_3f5f_4a3a_a7d2_0f9677a4408f.slice/crio-5a669dc649036005f7216623d5cd13cfc098df9a16be1d2102ce4c69cf14683c WatchSource:0}: Error finding container 5a669dc649036005f7216623d5cd13cfc098df9a16be1d2102ce4c69cf14683c: Status 404 returned error can't find the container with id 5a669dc649036005f7216623d5cd13cfc098df9a16be1d2102ce4c69cf14683c Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.006811 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.079272 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.091466 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n5nn7" event={"ID":"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f","Type":"ContainerStarted","Data":"5a669dc649036005f7216623d5cd13cfc098df9a16be1d2102ce4c69cf14683c"} Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.176756 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-s6l7z"] Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.193319 4912 scope.go:117] "RemoveContainer" containerID="3bd15fd32f535102af3f4ec1760f5ee401bf9686a976b0cde70be7949ddd1076" Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.208083 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-s6l7z"] Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.713021 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.837571 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvmbk"] Mar 18 13:26:37 crc kubenswrapper[4912]: I0318 13:26:37.896472 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.052002 4912 scope.go:117] "RemoveContainer" containerID="2f750434743d18b9fb3630441917dd76a707a124f6254124e0111e1f9bb0f3e6" Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.113219 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a01f4d6-98d9-49df-b056-436f333909bb","Type":"ContainerStarted","Data":"87ad7cfb29fe239b2c0f88dea6382c3d0d3aa0ff7ec626976e4287bcdb57293a"} Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.115760 4912 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b5cde83-581f-4589-b094-2248eb7430d3","Type":"ContainerStarted","Data":"17e992449c8c65254f1a68e296534fa2d54b3b6ea3494e5c5a8ae94577f46005"} Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.120244 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmbk" event={"ID":"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8","Type":"ContainerStarted","Data":"c68ff814631878a519de1a06ad7f6c8d1d17fd7c979646aab90c964df046a9a7"} Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.220643 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqm8k"] Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.255590 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" path="/var/lib/kubelet/pods/5307847c-fcc1-4e69-b2fc-3c0a7eb3c452/volumes" Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.349497 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zqbcj"] Mar 18 13:26:38 crc kubenswrapper[4912]: I0318 13:26:38.397190 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bcdccd79d-bvwsd"] Mar 18 13:26:38 crc kubenswrapper[4912]: W0318 13:26:38.489725 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66d0e89e_66f1_4da6_b974_4ab3c60b3520.slice/crio-5fbc86b4ea71b0a9a323ae52dcdee82592383b115430fdab475e8d3d8b6256d1 WatchSource:0}: Error finding container 5fbc86b4ea71b0a9a323ae52dcdee82592383b115430fdab475e8d3d8b6256d1: Status 404 returned error can't find the container with id 5fbc86b4ea71b0a9a323ae52dcdee82592383b115430fdab475e8d3d8b6256d1 Mar 18 13:26:38 crc kubenswrapper[4912]: W0318 13:26:38.515615 4912 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3df1cac_4f07_4be6_9c80_dcdd7b6c1967.slice/crio-4c06223e1c3f08f0c74c841cf77654c2c44a1308255b110c4c6bdcf3173acce2 WatchSource:0}: Error finding container 4c06223e1c3f08f0c74c841cf77654c2c44a1308255b110c4c6bdcf3173acce2: Status 404 returned error can't find the container with id 4c06223e1c3f08f0c74c841cf77654c2c44a1308255b110c4c6bdcf3173acce2 Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.092265 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69b7f8c44f-4zs4t"] Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.097219 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.102898 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.103170 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.106736 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69b7f8c44f-4zs4t"] Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.186482 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" event={"ID":"66d0e89e-66f1-4da6-b974-4ab3c60b3520","Type":"ContainerStarted","Data":"5fbc86b4ea71b0a9a323ae52dcdee82592383b115430fdab475e8d3d8b6256d1"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.191228 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcdccd79d-bvwsd" event={"ID":"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967","Type":"ContainerStarted","Data":"4c06223e1c3f08f0c74c841cf77654c2c44a1308255b110c4c6bdcf3173acce2"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.206940 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-ovndb-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.207058 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-public-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.207136 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-config\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.207173 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbx4\" (UniqueName: \"kubernetes.io/projected/d131ebfd-7b8e-441d-bd94-b3d52465ae15-kube-api-access-swbx4\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.207208 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-combined-ca-bundle\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.207265 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-internal-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.207320 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-httpd-config\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.217074 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4dhx" event={"ID":"539c384f-3502-4474-9c40-432909696dfb","Type":"ContainerStarted","Data":"721e891730d0796f3d304ba629af0751a53659d4ba1dab881f94538eb2e1c84e"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.231097 4912 generic.go:334] "Generic (PLEG): container finished" podID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerID="b6a5ecd7ae529677c227d79a14d7f9a37439dc4dd0e81e0ca38592882679895d" exitCode=0 Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.233869 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqm8k" event={"ID":"01119d5a-cc51-4ab9-8bb6-17a0893b3da0","Type":"ContainerDied","Data":"b6a5ecd7ae529677c227d79a14d7f9a37439dc4dd0e81e0ca38592882679895d"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.233938 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqm8k" event={"ID":"01119d5a-cc51-4ab9-8bb6-17a0893b3da0","Type":"ContainerStarted","Data":"dfd61faefc60713ecbb71ac2dd5a2fb6b26c6b4bd12b39531bc0b5cfd9337518"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.250681 4912 generic.go:334] "Generic 
(PLEG): container finished" podID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerID="57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e" exitCode=0 Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.250760 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmbk" event={"ID":"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8","Type":"ContainerDied","Data":"57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.254754 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n5nn7" event={"ID":"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f","Type":"ContainerStarted","Data":"610b5de5bf63bea5146a58b8d390f9f95719eb3d511de8405fc700da8a0f8988"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.272183 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tlb72" event={"ID":"a4cdd8f3-f85b-4a98-a164-bc9462b4932f","Type":"ContainerStarted","Data":"cfc4b092fd8cb092417aff68a13ee61fb9866ebb8bb44cd86f45e2a2cca68065"} Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.296251 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-j4dhx" podStartSLOduration=4.915596181 podStartE2EDuration="42.296225419s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="2026-03-18 13:25:59.872892474 +0000 UTC m=+1408.332319899" lastFinishedPulling="2026-03-18 13:26:37.253521702 +0000 UTC m=+1445.712949137" observedRunningTime="2026-03-18 13:26:39.241264783 +0000 UTC m=+1447.700692238" watchObservedRunningTime="2026-03-18 13:26:39.296225419 +0000 UTC m=+1447.755652844" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.309612 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-internal-tls-certs\") pod 
\"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.309736 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-httpd-config\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.310314 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-ovndb-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.310490 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-public-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.310642 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-config\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.310724 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swbx4\" (UniqueName: \"kubernetes.io/projected/d131ebfd-7b8e-441d-bd94-b3d52465ae15-kube-api-access-swbx4\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " 
pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.310812 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-combined-ca-bundle\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.324960 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-config\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.330892 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-internal-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.333955 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-combined-ca-bundle\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.336563 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swbx4\" (UniqueName: \"kubernetes.io/projected/d131ebfd-7b8e-441d-bd94-b3d52465ae15-kube-api-access-swbx4\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.349443 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-public-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.350856 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-httpd-config\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.354901 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-ovndb-tls-certs\") pod \"neutron-69b7f8c44f-4zs4t\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.373567 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n5nn7" podStartSLOduration=26.373538044 podStartE2EDuration="26.373538044s" podCreationTimestamp="2026-03-18 13:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:39.291487581 +0000 UTC m=+1447.750915016" watchObservedRunningTime="2026-03-18 13:26:39.373538044 +0000 UTC m=+1447.832965469" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.394023 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-tlb72" podStartSLOduration=7.699857592 podStartE2EDuration="42.393996093s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="2026-03-18 13:26:01.277245058 +0000 UTC m=+1409.736672483" 
lastFinishedPulling="2026-03-18 13:26:35.971383549 +0000 UTC m=+1444.430810984" observedRunningTime="2026-03-18 13:26:39.342524731 +0000 UTC m=+1447.801952156" watchObservedRunningTime="2026-03-18 13:26:39.393996093 +0000 UTC m=+1447.853423518" Mar 18 13:26:39 crc kubenswrapper[4912]: I0318 13:26:39.492313 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.321471 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcdccd79d-bvwsd" event={"ID":"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967","Type":"ContainerStarted","Data":"c02851869e590595fd381d5a03d0cba2f3ba9966778d483d5c2d3e985081f5c2"} Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.322383 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcdccd79d-bvwsd" event={"ID":"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967","Type":"ContainerStarted","Data":"8bfe9b0ce31606171c776b87ff85fb45661e3ca780a21a8abe4ed0eedfd5ee11"} Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.322546 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.328813 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a01f4d6-98d9-49df-b056-436f333909bb","Type":"ContainerStarted","Data":"42e0127644f532b348025f7df7619a404bc946dab2301b65068c037fe8bc9f8d"} Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.341249 4912 generic.go:334] "Generic (PLEG): container finished" podID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerID="0235ce3e0286fa2bfa42a9399dc6d10ab8c54015656606c3f7805aa2190686da" exitCode=0 Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.341386 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" 
event={"ID":"66d0e89e-66f1-4da6-b974-4ab3c60b3520","Type":"ContainerDied","Data":"0235ce3e0286fa2bfa42a9399dc6d10ab8c54015656606c3f7805aa2190686da"} Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.351236 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bcdccd79d-bvwsd" podStartSLOduration=4.351208065 podStartE2EDuration="4.351208065s" podCreationTimestamp="2026-03-18 13:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:40.348938874 +0000 UTC m=+1448.808366309" watchObservedRunningTime="2026-03-18 13:26:40.351208065 +0000 UTC m=+1448.810635490" Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.375218 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b5cde83-581f-4589-b094-2248eb7430d3","Type":"ContainerStarted","Data":"fea833f54f21a2c58861ffc97de67f6c570ec8fd3429ab4275e58e62000dc932"} Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.419247 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69b7f8c44f-4zs4t"] Mar 18 13:26:40 crc kubenswrapper[4912]: I0318 13:26:40.711012 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-s6l7z" podUID="5307847c-fcc1-4e69-b2fc-3c0a7eb3c452" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: i/o timeout" Mar 18 13:26:41 crc kubenswrapper[4912]: I0318 13:26:41.391876 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69b7f8c44f-4zs4t" event={"ID":"d131ebfd-7b8e-441d-bd94-b3d52465ae15","Type":"ContainerStarted","Data":"21b1895d5ed7473eaf8214266c750af09941bf2d62088f819e99cfd42d5fd80d"} Mar 18 13:26:44 crc kubenswrapper[4912]: I0318 13:26:44.430468 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"2b5cde83-581f-4589-b094-2248eb7430d3","Type":"ContainerStarted","Data":"f5801bb0293739b2976ca898be44b5d6224ac06ff96f49158776c37517b87d88"} Mar 18 13:26:44 crc kubenswrapper[4912]: I0318 13:26:44.440176 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a01f4d6-98d9-49df-b056-436f333909bb","Type":"ContainerStarted","Data":"00eca2aec65f34cd0d48bc6e825d46de32abceac203177a231daac5d742dfb97"} Mar 18 13:26:44 crc kubenswrapper[4912]: I0318 13:26:44.468783 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.468751461 podStartE2EDuration="30.468751461s" podCreationTimestamp="2026-03-18 13:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:44.464444846 +0000 UTC m=+1452.923872281" watchObservedRunningTime="2026-03-18 13:26:44.468751461 +0000 UTC m=+1452.928178886" Mar 18 13:26:44 crc kubenswrapper[4912]: I0318 13:26:44.501928 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=30.501900221 podStartE2EDuration="30.501900221s" podCreationTimestamp="2026-03-18 13:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:44.491937474 +0000 UTC m=+1452.951364919" watchObservedRunningTime="2026-03-18 13:26:44.501900221 +0000 UTC m=+1452.961327646" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.249863 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.250280 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:26:45 
crc kubenswrapper[4912]: I0318 13:26:45.250292 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.250301 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.296553 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.305737 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.496294 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.496413 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.496432 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.496527 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.546689 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:45 crc kubenswrapper[4912]: I0318 13:26:45.563082 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:46 crc kubenswrapper[4912]: I0318 13:26:46.465337 4912 generic.go:334] "Generic (PLEG): container finished" podID="a4cdd8f3-f85b-4a98-a164-bc9462b4932f" 
containerID="cfc4b092fd8cb092417aff68a13ee61fb9866ebb8bb44cd86f45e2a2cca68065" exitCode=0 Mar 18 13:26:46 crc kubenswrapper[4912]: I0318 13:26:46.465802 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tlb72" event={"ID":"a4cdd8f3-f85b-4a98-a164-bc9462b4932f","Type":"ContainerDied","Data":"cfc4b092fd8cb092417aff68a13ee61fb9866ebb8bb44cd86f45e2a2cca68065"} Mar 18 13:26:46 crc kubenswrapper[4912]: I0318 13:26:46.469865 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n5nn7" event={"ID":"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f","Type":"ContainerDied","Data":"610b5de5bf63bea5146a58b8d390f9f95719eb3d511de8405fc700da8a0f8988"} Mar 18 13:26:46 crc kubenswrapper[4912]: I0318 13:26:46.469812 4912 generic.go:334] "Generic (PLEG): container finished" podID="99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" containerID="610b5de5bf63bea5146a58b8d390f9f95719eb3d511de8405fc700da8a0f8988" exitCode=0 Mar 18 13:26:47 crc kubenswrapper[4912]: I0318 13:26:47.489344 4912 generic.go:334] "Generic (PLEG): container finished" podID="539c384f-3502-4474-9c40-432909696dfb" containerID="721e891730d0796f3d304ba629af0751a53659d4ba1dab881f94538eb2e1c84e" exitCode=0 Mar 18 13:26:47 crc kubenswrapper[4912]: I0318 13:26:47.489409 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4dhx" event={"ID":"539c384f-3502-4474-9c40-432909696dfb","Type":"ContainerDied","Data":"721e891730d0796f3d304ba629af0751a53659d4ba1dab881f94538eb2e1c84e"} Mar 18 13:26:48 crc kubenswrapper[4912]: I0318 13:26:48.980402 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tlb72" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.027091 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.058620 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-logs\") pod \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.059032 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-combined-ca-bundle\") pod \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.059246 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-scripts\") pod \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.059279 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-config-data\") pod \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.059360 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9nrt\" (UniqueName: \"kubernetes.io/projected/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-kube-api-access-z9nrt\") pod \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\" (UID: \"a4cdd8f3-f85b-4a98-a164-bc9462b4932f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.076980 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-logs" (OuterVolumeSpecName: "logs") pod "a4cdd8f3-f85b-4a98-a164-bc9462b4932f" (UID: "a4cdd8f3-f85b-4a98-a164-bc9462b4932f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.123452 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-kube-api-access-z9nrt" (OuterVolumeSpecName: "kube-api-access-z9nrt") pod "a4cdd8f3-f85b-4a98-a164-bc9462b4932f" (UID: "a4cdd8f3-f85b-4a98-a164-bc9462b4932f"). InnerVolumeSpecName "kube-api-access-z9nrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.123413 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-scripts" (OuterVolumeSpecName: "scripts") pod "a4cdd8f3-f85b-4a98-a164-bc9462b4932f" (UID: "a4cdd8f3-f85b-4a98-a164-bc9462b4932f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.158366 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-j4dhx" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.160702 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4cdd8f3-f85b-4a98-a164-bc9462b4932f" (UID: "a4cdd8f3-f85b-4a98-a164-bc9462b4932f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.161674 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-credential-keys\") pod \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.161816 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-scripts\") pod \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.161909 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq7gz\" (UniqueName: \"kubernetes.io/projected/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-kube-api-access-qq7gz\") pod \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.162109 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-config-data\") pod \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.162173 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-combined-ca-bundle\") pod \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.162305 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-fernet-keys\") pod \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\" (UID: \"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.163204 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.163226 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.163238 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9nrt\" (UniqueName: \"kubernetes.io/projected/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-kube-api-access-z9nrt\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.163250 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.171330 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" (UID: "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.191685 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-kube-api-access-qq7gz" (OuterVolumeSpecName: "kube-api-access-qq7gz") pod "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" (UID: "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f"). InnerVolumeSpecName "kube-api-access-qq7gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.193630 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-scripts" (OuterVolumeSpecName: "scripts") pod "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" (UID: "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.220488 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" (UID: "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.264925 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-config-data\") pod \"539c384f-3502-4474-9c40-432909696dfb\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.265009 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-combined-ca-bundle\") pod \"539c384f-3502-4474-9c40-432909696dfb\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.265159 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9ts8\" (UniqueName: \"kubernetes.io/projected/539c384f-3502-4474-9c40-432909696dfb-kube-api-access-q9ts8\") pod \"539c384f-3502-4474-9c40-432909696dfb\" (UID: \"539c384f-3502-4474-9c40-432909696dfb\") " Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.265989 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.266024 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq7gz\" (UniqueName: \"kubernetes.io/projected/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-kube-api-access-qq7gz\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.266052 4912 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.266064 4912 
reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.289560 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/539c384f-3502-4474-9c40-432909696dfb-kube-api-access-q9ts8" (OuterVolumeSpecName: "kube-api-access-q9ts8") pod "539c384f-3502-4474-9c40-432909696dfb" (UID: "539c384f-3502-4474-9c40-432909696dfb"). InnerVolumeSpecName "kube-api-access-q9ts8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.335110 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-config-data" (OuterVolumeSpecName: "config-data") pod "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" (UID: "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.368284 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9ts8\" (UniqueName: \"kubernetes.io/projected/539c384f-3502-4474-9c40-432909696dfb-kube-api-access-q9ts8\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.368320 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.375644 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-config-data" (OuterVolumeSpecName: "config-data") pod "a4cdd8f3-f85b-4a98-a164-bc9462b4932f" (UID: "a4cdd8f3-f85b-4a98-a164-bc9462b4932f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.382630 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" (UID: "99d3f399-3f5f-4a3a-a7d2-0f9677a4408f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.400471 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "539c384f-3502-4474-9c40-432909696dfb" (UID: "539c384f-3502-4474-9c40-432909696dfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.462112 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-config-data" (OuterVolumeSpecName: "config-data") pod "539c384f-3502-4474-9c40-432909696dfb" (UID: "539c384f-3502-4474-9c40-432909696dfb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.477447 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.477642 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539c384f-3502-4474-9c40-432909696dfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.477727 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cdd8f3-f85b-4a98-a164-bc9462b4932f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.477788 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.520647 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69b7f8c44f-4zs4t" event={"ID":"d131ebfd-7b8e-441d-bd94-b3d52465ae15","Type":"ContainerStarted","Data":"19564a37a9b24261d7c9934eb493448b57a6b3b15e9b7b8a2666172c28b5565d"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.527420 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n5nn7" event={"ID":"99d3f399-3f5f-4a3a-a7d2-0f9677a4408f","Type":"ContainerDied","Data":"5a669dc649036005f7216623d5cd13cfc098df9a16be1d2102ce4c69cf14683c"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.527472 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a669dc649036005f7216623d5cd13cfc098df9a16be1d2102ce4c69cf14683c" Mar 18 13:26:49 crc kubenswrapper[4912]: 
I0318 13:26:49.527546 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n5nn7" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.564953 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqm8k" event={"ID":"01119d5a-cc51-4ab9-8bb6-17a0893b3da0","Type":"ContainerStarted","Data":"479c70d89d7e033b8d2e89e21de1199ed685ea2ffe4daaf49022e1097269e964"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.576833 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tlb72" event={"ID":"a4cdd8f3-f85b-4a98-a164-bc9462b4932f","Type":"ContainerDied","Data":"503b6c4271a275d0f1796fe1dd709664811aecd6070c2967aab36869c2deb5e9"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.576895 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="503b6c4271a275d0f1796fe1dd709664811aecd6070c2967aab36869c2deb5e9" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.577028 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tlb72" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.597858 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" event={"ID":"66d0e89e-66f1-4da6-b974-4ab3c60b3520","Type":"ContainerStarted","Data":"573326ca6595d80b8945733ff296bc5a27913cf458e237a64a43b1ddc6cc7109"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.598541 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.618223 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c","Type":"ContainerStarted","Data":"b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.636689 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4dhx" event={"ID":"539c384f-3502-4474-9c40-432909696dfb","Type":"ContainerDied","Data":"2a08a6f6f90c5430e8a4da755dee841ff7eebb140c794997f39763059563d0c4"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.636746 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a08a6f6f90c5430e8a4da755dee841ff7eebb140c794997f39763059563d0c4" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.636859 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-j4dhx" Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.657134 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmbk" event={"ID":"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8","Type":"ContainerStarted","Data":"e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035"} Mar 18 13:26:49 crc kubenswrapper[4912]: I0318 13:26:49.664436 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" podStartSLOduration=13.664404983 podStartE2EDuration="13.664404983s" podCreationTimestamp="2026-03-18 13:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:49.629586869 +0000 UTC m=+1458.089014324" watchObservedRunningTime="2026-03-18 13:26:49.664404983 +0000 UTC m=+1458.123832408" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.298317 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-79d7d9b7f7-jbmpn"] Mar 18 13:26:50 crc kubenswrapper[4912]: E0318 13:26:50.302377 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" containerName="keystone-bootstrap" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.302409 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" containerName="keystone-bootstrap" Mar 18 13:26:50 crc kubenswrapper[4912]: E0318 13:26:50.302452 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="539c384f-3502-4474-9c40-432909696dfb" containerName="heat-db-sync" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.302464 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="539c384f-3502-4474-9c40-432909696dfb" containerName="heat-db-sync" Mar 18 13:26:50 crc kubenswrapper[4912]: E0318 13:26:50.302507 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a4cdd8f3-f85b-4a98-a164-bc9462b4932f" containerName="placement-db-sync" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.302513 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cdd8f3-f85b-4a98-a164-bc9462b4932f" containerName="placement-db-sync" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.302851 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" containerName="keystone-bootstrap" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.302860 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="539c384f-3502-4474-9c40-432909696dfb" containerName="heat-db-sync" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.302876 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4cdd8f3-f85b-4a98-a164-bc9462b4932f" containerName="placement-db-sync" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.303828 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.309127 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-545f7ccb8-vvt42"] Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.311853 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.312724 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.312947 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.315646 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.323581 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zqbzz" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.323921 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.324098 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.324205 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.324313 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knn24" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.324662 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.324800 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.324321 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.325479 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-scripts\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.325535 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-config-data\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.325583 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-public-tls-certs\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.325786 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-combined-ca-bundle\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.326165 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf9h\" (UniqueName: \"kubernetes.io/projected/5c102a6b-029b-47f9-bdd5-66fb03606564-kube-api-access-kcf9h\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.326367 
4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-internal-tls-certs\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.326456 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-credential-keys\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.326762 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-fernet-keys\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.339175 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79d7d9b7f7-jbmpn"] Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.382078 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-545f7ccb8-vvt42"] Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.431500 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-combined-ca-bundle\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.431596 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159b42fc-4ed2-409f-9bda-f68df79afb47-logs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.431627 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf9h\" (UniqueName: \"kubernetes.io/projected/5c102a6b-029b-47f9-bdd5-66fb03606564-kube-api-access-kcf9h\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.432876 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-internal-tls-certs\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.432922 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-internal-tls-certs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.432964 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-credential-keys\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.432998 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-config-data\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433020 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-public-tls-certs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433124 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-combined-ca-bundle\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433169 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889zf\" (UniqueName: \"kubernetes.io/projected/159b42fc-4ed2-409f-9bda-f68df79afb47-kube-api-access-889zf\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433236 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-fernet-keys\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433380 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-scripts\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433399 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-config-data\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433431 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-public-tls-certs\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.433462 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-scripts\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.438177 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-internal-tls-certs\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.442983 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-config-data\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: 
\"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.447752 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-credential-keys\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.448709 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-fernet-keys\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.460475 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-public-tls-certs\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.460524 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-scripts\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.460837 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c102a6b-029b-47f9-bdd5-66fb03606564-combined-ca-bundle\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 
13:26:50.465570 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf9h\" (UniqueName: \"kubernetes.io/projected/5c102a6b-029b-47f9-bdd5-66fb03606564-kube-api-access-kcf9h\") pod \"keystone-79d7d9b7f7-jbmpn\" (UID: \"5c102a6b-029b-47f9-bdd5-66fb03606564\") " pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.542351 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-scripts\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.542542 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159b42fc-4ed2-409f-9bda-f68df79afb47-logs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.542586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-internal-tls-certs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.542631 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-config-data\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.542652 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-public-tls-certs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.542679 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-combined-ca-bundle\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.542718 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889zf\" (UniqueName: \"kubernetes.io/projected/159b42fc-4ed2-409f-9bda-f68df79afb47-kube-api-access-889zf\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.545466 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159b42fc-4ed2-409f-9bda-f68df79afb47-logs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.550913 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-internal-tls-certs\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.552585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-public-tls-certs\") pod 
\"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.552751 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-config-data\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.554328 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-combined-ca-bundle\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.557598 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-scripts\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.568388 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889zf\" (UniqueName: \"kubernetes.io/projected/159b42fc-4ed2-409f-9bda-f68df79afb47-kube-api-access-889zf\") pod \"placement-545f7ccb8-vvt42\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.618777 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f6bb888-s77j2"] Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.621970 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.644417 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f6bb888-s77j2"] Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.674864 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69b7f8c44f-4zs4t" event={"ID":"d131ebfd-7b8e-441d-bd94-b3d52465ae15","Type":"ContainerStarted","Data":"9ce013bedffcdb061ece7ba64de1d7a0ecab72853f02f5494a44e429e88dcb81"} Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.675178 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.693779 4912 generic.go:334] "Generic (PLEG): container finished" podID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerID="479c70d89d7e033b8d2e89e21de1199ed685ea2ffe4daaf49022e1097269e964" exitCode=0 Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.694015 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqm8k" event={"ID":"01119d5a-cc51-4ab9-8bb6-17a0893b3da0","Type":"ContainerDied","Data":"479c70d89d7e033b8d2e89e21de1199ed685ea2ffe4daaf49022e1097269e964"} Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.698898 4912 generic.go:334] "Generic (PLEG): container finished" podID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerID="e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035" exitCode=0 Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.699090 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmbk" event={"ID":"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8","Type":"ContainerDied","Data":"e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035"} Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.700665 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.716992 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.724723 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69b7f8c44f-4zs4t" podStartSLOduration=11.724663791 podStartE2EDuration="11.724663791s" podCreationTimestamp="2026-03-18 13:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:50.710392098 +0000 UTC m=+1459.169819533" watchObservedRunningTime="2026-03-18 13:26:50.724663791 +0000 UTC m=+1459.184091216" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.751576 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-config-data\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.751769 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w27cc\" (UniqueName: \"kubernetes.io/projected/7dda5aa6-ee95-4b75-9204-b014aba202ae-kube-api-access-w27cc\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.751881 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-scripts\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " 
pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.752106 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-public-tls-certs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.752209 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-combined-ca-bundle\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.752494 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dda5aa6-ee95-4b75-9204-b014aba202ae-logs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.752569 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-internal-tls-certs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.857779 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-public-tls-certs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " 
pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.857873 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-combined-ca-bundle\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.858051 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dda5aa6-ee95-4b75-9204-b014aba202ae-logs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.858112 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-internal-tls-certs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.858161 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-config-data\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.858246 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w27cc\" (UniqueName: \"kubernetes.io/projected/7dda5aa6-ee95-4b75-9204-b014aba202ae-kube-api-access-w27cc\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 
13:26:50.858317 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-scripts\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.859896 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dda5aa6-ee95-4b75-9204-b014aba202ae-logs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.867851 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-internal-tls-certs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.869653 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-scripts\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.870416 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-public-tls-certs\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.871702 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-config-data\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.875084 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dda5aa6-ee95-4b75-9204-b014aba202ae-combined-ca-bundle\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.876660 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.879958 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w27cc\" (UniqueName: \"kubernetes.io/projected/7dda5aa6-ee95-4b75-9204-b014aba202ae-kube-api-access-w27cc\") pod \"placement-5f6bb888-s77j2\" (UID: \"7dda5aa6-ee95-4b75-9204-b014aba202ae\") " pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.881452 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.897919 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:26:50 crc kubenswrapper[4912]: I0318 13:26:50.981410 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:51 crc kubenswrapper[4912]: I0318 13:26:51.398922 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79d7d9b7f7-jbmpn"] Mar 18 13:26:51 crc kubenswrapper[4912]: I0318 13:26:51.613837 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-545f7ccb8-vvt42"] Mar 18 13:26:51 crc kubenswrapper[4912]: I0318 13:26:51.750402 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79d7d9b7f7-jbmpn" event={"ID":"5c102a6b-029b-47f9-bdd5-66fb03606564","Type":"ContainerStarted","Data":"a15b86b1b1a550575838995dd8fe19169d0af5c18c0c6dd618209500b7b1bd1f"} Mar 18 13:26:51 crc kubenswrapper[4912]: I0318 13:26:51.756372 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-545f7ccb8-vvt42" event={"ID":"159b42fc-4ed2-409f-9bda-f68df79afb47","Type":"ContainerStarted","Data":"d41b09baed569265b952e2ad8f30c824550c33a4e08daf1e2b87fa2b61dba9fa"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.108119 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f6bb888-s77j2"] Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.796536 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqm8k" event={"ID":"01119d5a-cc51-4ab9-8bb6-17a0893b3da0","Type":"ContainerStarted","Data":"06215be31d7302bcf66e43bca95aede8926ab3093133d6005fafd86c3e8ebf11"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.808345 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-545f7ccb8-vvt42" event={"ID":"159b42fc-4ed2-409f-9bda-f68df79afb47","Type":"ContainerStarted","Data":"652095645d51d0cc2395a38caac6d84a53965396990722058dc23ae7de41adc8"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.828487 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79d7d9b7f7-jbmpn" 
event={"ID":"5c102a6b-029b-47f9-bdd5-66fb03606564","Type":"ContainerStarted","Data":"b360f0557c57ae2b9c1fbb3622dcb8100909312a149feaeb0228da9bb172ca54"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.830245 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.871441 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rqm8k" podStartSLOduration=10.769005698 podStartE2EDuration="22.87140365s" podCreationTimestamp="2026-03-18 13:26:30 +0000 UTC" firstStartedPulling="2026-03-18 13:26:39.264160688 +0000 UTC m=+1447.723588113" lastFinishedPulling="2026-03-18 13:26:51.36655864 +0000 UTC m=+1459.825986065" observedRunningTime="2026-03-18 13:26:52.822357154 +0000 UTC m=+1461.281784589" watchObservedRunningTime="2026-03-18 13:26:52.87140365 +0000 UTC m=+1461.330831105" Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.895520 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmbk" event={"ID":"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8","Type":"ContainerStarted","Data":"3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.900490 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d7x2r" event={"ID":"e7732fd2-b813-47e5-8f23-823a3037df09","Type":"ContainerStarted","Data":"010d6616847f86f6d8923fa483a53f6b366513a314ef311a8d3c6d68eb430c5b"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.909860 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-79d7d9b7f7-jbmpn" podStartSLOduration=2.909825571 podStartE2EDuration="2.909825571s" podCreationTimestamp="2026-03-18 13:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 13:26:52.875270034 +0000 UTC m=+1461.334697459" watchObservedRunningTime="2026-03-18 13:26:52.909825571 +0000 UTC m=+1461.369252996" Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.943720 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvmbk" podStartSLOduration=12.818667671 podStartE2EDuration="24.94368688s" podCreationTimestamp="2026-03-18 13:26:28 +0000 UTC" firstStartedPulling="2026-03-18 13:26:39.264161908 +0000 UTC m=+1447.723589333" lastFinishedPulling="2026-03-18 13:26:51.389181107 +0000 UTC m=+1459.848608542" observedRunningTime="2026-03-18 13:26:52.920693853 +0000 UTC m=+1461.380121278" watchObservedRunningTime="2026-03-18 13:26:52.94368688 +0000 UTC m=+1461.403114315" Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.948593 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f6bb888-s77j2" event={"ID":"7dda5aa6-ee95-4b75-9204-b014aba202ae","Type":"ContainerStarted","Data":"8a7d910a9973cd103db9f216682aad094e378d22e572ca9af3c31de316730d23"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.948653 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f6bb888-s77j2" event={"ID":"7dda5aa6-ee95-4b75-9204-b014aba202ae","Type":"ContainerStarted","Data":"5008b3b3114bc78339cf14e9248dad37e66c1ebb9dbbe25a34522b28babd13f3"} Mar 18 13:26:52 crc kubenswrapper[4912]: I0318 13:26:52.957835 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-d7x2r" podStartSLOduration=5.50351023 podStartE2EDuration="55.957812139s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="2026-03-18 13:26:00.814669882 +0000 UTC m=+1409.274097307" lastFinishedPulling="2026-03-18 13:26:51.268971801 +0000 UTC m=+1459.728399216" observedRunningTime="2026-03-18 13:26:52.948736206 +0000 UTC m=+1461.408163631" watchObservedRunningTime="2026-03-18 
13:26:52.957812139 +0000 UTC m=+1461.417239554" Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.010403 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f6bb888-s77j2" event={"ID":"7dda5aa6-ee95-4b75-9204-b014aba202ae","Type":"ContainerStarted","Data":"ffb6a617628f4411ccf32973912d90f4f9fcf85643c86efc612efac60966585d"} Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.013241 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.013355 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.057596 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-545f7ccb8-vvt42" event={"ID":"159b42fc-4ed2-409f-9bda-f68df79afb47","Type":"ContainerStarted","Data":"ae02311b5b822593fc668323470d6c25e6034495d3108df0352f6b04f8e73911"} Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.058685 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.058767 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.077567 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f6bb888-s77j2" podStartSLOduration=4.077537283 podStartE2EDuration="4.077537283s" podCreationTimestamp="2026-03-18 13:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:54.047333783 +0000 UTC m=+1462.506761218" watchObservedRunningTime="2026-03-18 13:26:54.077537283 +0000 UTC m=+1462.536964708" Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.089155 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fbf2l" event={"ID":"147c4d2b-19d3-48da-9364-c527a1cacc3c","Type":"ContainerStarted","Data":"7d8f21a6020213a9a299dca6955d632607150dff3cf297d54334fdd46836370e"} Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.103975 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-545f7ccb8-vvt42" podStartSLOduration=4.103936762 podStartE2EDuration="4.103936762s" podCreationTimestamp="2026-03-18 13:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:26:54.091139068 +0000 UTC m=+1462.550566503" watchObservedRunningTime="2026-03-18 13:26:54.103936762 +0000 UTC m=+1462.563364177" Mar 18 13:26:54 crc kubenswrapper[4912]: I0318 13:26:54.138878 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fbf2l" podStartSLOduration=5.708492842 podStartE2EDuration="57.138846469s" podCreationTimestamp="2026-03-18 13:25:57 +0000 UTC" firstStartedPulling="2026-03-18 13:25:59.872660788 +0000 UTC m=+1408.332088213" lastFinishedPulling="2026-03-18 13:26:51.303014415 +0000 UTC m=+1459.762441840" observedRunningTime="2026-03-18 13:26:54.122394147 +0000 UTC m=+1462.581821572" watchObservedRunningTime="2026-03-18 13:26:54.138846469 +0000 UTC m=+1462.598273894" Mar 18 13:26:55 crc kubenswrapper[4912]: I0318 13:26:55.189455 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:26:56 crc kubenswrapper[4912]: I0318 13:26:56.911315 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:26:57 crc kubenswrapper[4912]: I0318 13:26:57.003093 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gp7xf"] Mar 18 13:26:57 crc kubenswrapper[4912]: 
I0318 13:26:57.003417 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="dnsmasq-dns" containerID="cri-o://0a64d87e0d6fe16eaab283a6cba6329161f876089c59b7d252b8ad343c70a8de" gracePeriod=10 Mar 18 13:26:58 crc kubenswrapper[4912]: I0318 13:26:58.191888 4912 generic.go:334] "Generic (PLEG): container finished" podID="dadc5395-e931-4293-b037-929db9a9bd99" containerID="0a64d87e0d6fe16eaab283a6cba6329161f876089c59b7d252b8ad343c70a8de" exitCode=0 Mar 18 13:26:58 crc kubenswrapper[4912]: I0318 13:26:58.192098 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" event={"ID":"dadc5395-e931-4293-b037-929db9a9bd99","Type":"ContainerDied","Data":"0a64d87e0d6fe16eaab283a6cba6329161f876089c59b7d252b8ad343c70a8de"} Mar 18 13:26:58 crc kubenswrapper[4912]: I0318 13:26:58.200195 4912 generic.go:334] "Generic (PLEG): container finished" podID="e7732fd2-b813-47e5-8f23-823a3037df09" containerID="010d6616847f86f6d8923fa483a53f6b366513a314ef311a8d3c6d68eb430c5b" exitCode=0 Mar 18 13:26:58 crc kubenswrapper[4912]: I0318 13:26:58.200299 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d7x2r" event={"ID":"e7732fd2-b813-47e5-8f23-823a3037df09","Type":"ContainerDied","Data":"010d6616847f86f6d8923fa483a53f6b366513a314ef311a8d3c6d68eb430c5b"} Mar 18 13:26:58 crc kubenswrapper[4912]: I0318 13:26:58.505967 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:58 crc kubenswrapper[4912]: I0318 13:26:58.506551 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:26:58 crc kubenswrapper[4912]: I0318 13:26:58.981929 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" 
podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: connect: connection refused" Mar 18 13:26:59 crc kubenswrapper[4912]: I0318 13:26:59.570223 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rvmbk" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" probeResult="failure" output=< Mar 18 13:26:59 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:26:59 crc kubenswrapper[4912]: > Mar 18 13:27:00 crc kubenswrapper[4912]: I0318 13:27:00.867162 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:27:00 crc kubenswrapper[4912]: I0318 13:27:00.867624 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:27:01 crc kubenswrapper[4912]: I0318 13:27:01.943852 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:01 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:01 crc kubenswrapper[4912]: > Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.000663 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.039915 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwvpg\" (UniqueName: \"kubernetes.io/projected/e7732fd2-b813-47e5-8f23-823a3037df09-kube-api-access-cwvpg\") pod \"e7732fd2-b813-47e5-8f23-823a3037df09\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.040024 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-combined-ca-bundle\") pod \"e7732fd2-b813-47e5-8f23-823a3037df09\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.041504 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-db-sync-config-data\") pod \"e7732fd2-b813-47e5-8f23-823a3037df09\" (UID: \"e7732fd2-b813-47e5-8f23-823a3037df09\") " Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.050482 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e7732fd2-b813-47e5-8f23-823a3037df09" (UID: "e7732fd2-b813-47e5-8f23-823a3037df09"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.051326 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7732fd2-b813-47e5-8f23-823a3037df09-kube-api-access-cwvpg" (OuterVolumeSpecName: "kube-api-access-cwvpg") pod "e7732fd2-b813-47e5-8f23-823a3037df09" (UID: "e7732fd2-b813-47e5-8f23-823a3037df09"). 
InnerVolumeSpecName "kube-api-access-cwvpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.078594 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7732fd2-b813-47e5-8f23-823a3037df09" (UID: "e7732fd2-b813-47e5-8f23-823a3037df09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.147199 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwvpg\" (UniqueName: \"kubernetes.io/projected/e7732fd2-b813-47e5-8f23-823a3037df09-kube-api-access-cwvpg\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.147927 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.147944 4912 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7732fd2-b813-47e5-8f23-823a3037df09-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.264921 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d7x2r" event={"ID":"e7732fd2-b813-47e5-8f23-823a3037df09","Type":"ContainerDied","Data":"5cb5e37974650a5a079286c131856bca16e42886df88e50d99080ae00dd0874e"} Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.265098 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cb5e37974650a5a079286c131856bca16e42886df88e50d99080ae00dd0874e" Mar 18 13:27:02 crc kubenswrapper[4912]: I0318 13:27:02.265201 4912 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d7x2r" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.473171 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-79f58cdfc5-fz7qx"] Mar 18 13:27:03 crc kubenswrapper[4912]: E0318 13:27:03.474554 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7732fd2-b813-47e5-8f23-823a3037df09" containerName="barbican-db-sync" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.474578 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7732fd2-b813-47e5-8f23-823a3037df09" containerName="barbican-db-sync" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.474881 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7732fd2-b813-47e5-8f23-823a3037df09" containerName="barbican-db-sync" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.489559 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-79f58cdfc5-fz7qx" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.500630 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xlmlb" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.500941 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.501226 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.528012 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-79f58cdfc5-fz7qx"] Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.577121 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7965f976fd-25ct8"] Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.579987 4912 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7965f976fd-25ct8" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.588120 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.590374 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85dfe685-650b-44a6-b164-137cae893166-logs\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.590482 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lms4l\" (UniqueName: \"kubernetes.io/projected/85dfe685-650b-44a6-b164-137cae893166-kube-api-access-lms4l\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.590541 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-config-data\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.590561 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-combined-ca-bundle\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx" Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 
13:27:03.590602 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-config-data-custom\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.643974 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xkq2h"]
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.648488 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.684400 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7965f976fd-25ct8"]
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.693736 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85dfe685-650b-44a6-b164-137cae893166-logs\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.693809 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94867b1a-6891-4d44-b968-0b18a8b30085-logs\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.693871 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9jhs\" (UniqueName: \"kubernetes.io/projected/94867b1a-6891-4d44-b968-0b18a8b30085-kube-api-access-m9jhs\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.693894 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-config-data\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.693920 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lms4l\" (UniqueName: \"kubernetes.io/projected/85dfe685-650b-44a6-b164-137cae893166-kube-api-access-lms4l\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.693940 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-combined-ca-bundle\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.693994 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-config-data\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.694025 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-combined-ca-bundle\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.694074 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-config-data-custom\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.694110 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-config-data-custom\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.694289 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85dfe685-650b-44a6-b164-137cae893166-logs\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.719661 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-config-data-custom\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.721598 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-combined-ca-bundle\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.725758 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85dfe685-650b-44a6-b164-137cae893166-config-data\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.730204 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xkq2h"]
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.785995 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lms4l\" (UniqueName: \"kubernetes.io/projected/85dfe685-650b-44a6-b164-137cae893166-kube-api-access-lms4l\") pod \"barbican-worker-79f58cdfc5-fz7qx\" (UID: \"85dfe685-650b-44a6-b164-137cae893166\") " pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796355 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9jhs\" (UniqueName: \"kubernetes.io/projected/94867b1a-6891-4d44-b968-0b18a8b30085-kube-api-access-m9jhs\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796435 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-config-data\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796467 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-combined-ca-bundle\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796536 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht58n\" (UniqueName: \"kubernetes.io/projected/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-kube-api-access-ht58n\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796582 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-config-data-custom\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796602 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796683 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-config\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796722 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796764 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796796 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.796818 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94867b1a-6891-4d44-b968-0b18a8b30085-logs\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.814131 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94867b1a-6891-4d44-b968-0b18a8b30085-logs\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.835105 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7689cffd58-6dzrs"]
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.837350 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.842312 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.843419 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-combined-ca-bundle\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.854054 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-config-data-custom\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.855943 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-79f58cdfc5-fz7qx"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.861168 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94867b1a-6891-4d44-b968-0b18a8b30085-config-data\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.861948 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9jhs\" (UniqueName: \"kubernetes.io/projected/94867b1a-6891-4d44-b968-0b18a8b30085-kube-api-access-m9jhs\") pod \"barbican-keystone-listener-7965f976fd-25ct8\" (UID: \"94867b1a-6891-4d44-b968-0b18a8b30085\") " pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.884489 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7689cffd58-6dzrs"]
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.898926 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht58n\" (UniqueName: \"kubernetes.io/projected/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-kube-api-access-ht58n\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.899331 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.899426 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-config\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.899465 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.899504 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.899842 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.907393 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.909220 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-config\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.910790 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.911122 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.914571 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.930345 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht58n\" (UniqueName: \"kubernetes.io/projected/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-kube-api-access-ht58n\") pod \"dnsmasq-dns-848cf88cfc-xkq2h\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:03 crc kubenswrapper[4912]: I0318 13:27:03.957795 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7965f976fd-25ct8"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.002632 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data-custom\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.002715 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.002779 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds4vc\" (UniqueName: \"kubernetes.io/projected/7ca56c45-38cd-4c83-a1d3-44f505b6c402-kube-api-access-ds4vc\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.002853 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-combined-ca-bundle\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.002922 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ca56c45-38cd-4c83-a1d3-44f505b6c402-logs\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.015865 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.087719 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.114468 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ca56c45-38cd-4c83-a1d3-44f505b6c402-logs\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.114715 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data-custom\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.114796 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.114906 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds4vc\" (UniqueName: \"kubernetes.io/projected/7ca56c45-38cd-4c83-a1d3-44f505b6c402-kube-api-access-ds4vc\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.115088 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-combined-ca-bundle\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.120542 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ca56c45-38cd-4c83-a1d3-44f505b6c402-logs\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.135973 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data-custom\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.145354 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.176885 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-combined-ca-bundle\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.204700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds4vc\" (UniqueName: \"kubernetes.io/projected/7ca56c45-38cd-4c83-a1d3-44f505b6c402-kube-api-access-ds4vc\") pod \"barbican-api-7689cffd58-6dzrs\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.216846 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-svc\") pod \"dadc5395-e931-4293-b037-929db9a9bd99\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") "
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.228383 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-sb\") pod \"dadc5395-e931-4293-b037-929db9a9bd99\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") "
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.229376 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-config\") pod \"dadc5395-e931-4293-b037-929db9a9bd99\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") "
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.229461 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-swift-storage-0\") pod \"dadc5395-e931-4293-b037-929db9a9bd99\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") "
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.229491 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrghc\" (UniqueName: \"kubernetes.io/projected/dadc5395-e931-4293-b037-929db9a9bd99-kube-api-access-nrghc\") pod \"dadc5395-e931-4293-b037-929db9a9bd99\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") "
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.229555 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-nb\") pod \"dadc5395-e931-4293-b037-929db9a9bd99\" (UID: \"dadc5395-e931-4293-b037-929db9a9bd99\") "
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.270377 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadc5395-e931-4293-b037-929db9a9bd99-kube-api-access-nrghc" (OuterVolumeSpecName: "kube-api-access-nrghc") pod "dadc5395-e931-4293-b037-929db9a9bd99" (UID: "dadc5395-e931-4293-b037-929db9a9bd99"). InnerVolumeSpecName "kube-api-access-nrghc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.363180 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrghc\" (UniqueName: \"kubernetes.io/projected/dadc5395-e931-4293-b037-929db9a9bd99-kube-api-access-nrghc\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.378175 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.380374 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" event={"ID":"dadc5395-e931-4293-b037-929db9a9bd99","Type":"ContainerDied","Data":"354b66a06e01eac2e53c80654ea772516def04f904e368cfa9c6dc40a63c9dc6"}
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.380443 4912 scope.go:117] "RemoveContainer" containerID="0a64d87e0d6fe16eaab283a6cba6329161f876089c59b7d252b8ad343c70a8de"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.392570 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-config" (OuterVolumeSpecName: "config") pod "dadc5395-e931-4293-b037-929db9a9bd99" (UID: "dadc5395-e931-4293-b037-929db9a9bd99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.438553 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7689cffd58-6dzrs"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.449114 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dadc5395-e931-4293-b037-929db9a9bd99" (UID: "dadc5395-e931-4293-b037-929db9a9bd99"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.449916 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dadc5395-e931-4293-b037-929db9a9bd99" (UID: "dadc5395-e931-4293-b037-929db9a9bd99"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.461766 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dadc5395-e931-4293-b037-929db9a9bd99" (UID: "dadc5395-e931-4293-b037-929db9a9bd99"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.470039 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.470083 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-config\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.470095 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.470105 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.485729 4912 scope.go:117] "RemoveContainer" containerID="2f277b3c369b730545258bccc92a22af36f8ec2036848a4f82c2603740b767ac"
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.487639 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dadc5395-e931-4293-b037-929db9a9bd99" (UID: "dadc5395-e931-4293-b037-929db9a9bd99"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.572049 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dadc5395-e931-4293-b037-929db9a9bd99-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.870369 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gp7xf"]
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.893285 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gp7xf"]
Mar 18 13:27:04 crc kubenswrapper[4912]: I0318 13:27:04.981704 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7965f976fd-25ct8"]
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.088529 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-79f58cdfc5-fz7qx"]
Mar 18 13:27:05 crc kubenswrapper[4912]: W0318 13:27:05.157268 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85dfe685_650b_44a6_b164_137cae893166.slice/crio-75b74471d060c7b74352e3a27509b8b80f377b0aee817ef95d2710a23ab11a4e WatchSource:0}: Error finding container 75b74471d060c7b74352e3a27509b8b80f377b0aee817ef95d2710a23ab11a4e: Status 404 returned error can't find the container with id 75b74471d060c7b74352e3a27509b8b80f377b0aee817ef95d2710a23ab11a4e
Mar 18 13:27:05 crc kubenswrapper[4912]: E0318 13:27:05.275794 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.296044 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xkq2h"]
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.431928 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7965f976fd-25ct8" event={"ID":"94867b1a-6891-4d44-b968-0b18a8b30085","Type":"ContainerStarted","Data":"a513a69372438fc67953e816648d42765f1a4f3291068d50fcdddf70a0562c81"}
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.446452 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c","Type":"ContainerStarted","Data":"e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870"}
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.447074 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="sg-core" containerID="cri-o://b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5" gracePeriod=30
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.447596 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.448234 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="proxy-httpd" containerID="cri-o://e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870" gracePeriod=30
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.454618 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" event={"ID":"eb79a9e1-8101-4bd1-8c03-9d4d527b9203","Type":"ContainerStarted","Data":"e3ec631be4f881e53d3bbbb8bfbfeec83b0c665f824bbd08d760e9ac7bf99f06"}
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.467648 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79f58cdfc5-fz7qx" event={"ID":"85dfe685-650b-44a6-b164-137cae893166","Type":"ContainerStarted","Data":"75b74471d060c7b74352e3a27509b8b80f377b0aee817ef95d2710a23ab11a4e"}
Mar 18 13:27:05 crc kubenswrapper[4912]: I0318 13:27:05.624970 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7689cffd58-6dzrs"]
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.250921 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadc5395-e931-4293-b037-929db9a9bd99" path="/var/lib/kubelet/pods/dadc5395-e931-4293-b037-929db9a9bd99/volumes"
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.509785 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7689cffd58-6dzrs" event={"ID":"7ca56c45-38cd-4c83-a1d3-44f505b6c402","Type":"ContainerStarted","Data":"2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b"}
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.509863 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7689cffd58-6dzrs" event={"ID":"7ca56c45-38cd-4c83-a1d3-44f505b6c402","Type":"ContainerStarted","Data":"9a7f5bd90d39b68a0dd6c7127e1a9288ca9d5bec9197c8d1f9be43af1113a055"}
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.520983 4912 generic.go:334] "Generic (PLEG): container finished" podID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerID="b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5" exitCode=2
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.521180 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c","Type":"ContainerDied","Data":"b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5"}
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.524418 4912 generic.go:334] "Generic (PLEG): container finished" podID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerID="6a4231ca2e6e9e35e7bb3f0591fe7a3d0c140182f5de54c31ea831814c6efbe9" exitCode=0
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.524485 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" event={"ID":"eb79a9e1-8101-4bd1-8c03-9d4d527b9203","Type":"ContainerDied","Data":"6a4231ca2e6e9e35e7bb3f0591fe7a3d0c140182f5de54c31ea831814c6efbe9"}
Mar 18 13:27:06 crc kubenswrapper[4912]: I0318 13:27:06.939682 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bcdccd79d-bvwsd"
Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.557873 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69b7f8c44f-4zs4t"]
Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.559816 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69b7f8c44f-4zs4t" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-api" containerID="cri-o://19564a37a9b24261d7c9934eb493448b57a6b3b15e9b7b8a2666172c28b5565d" gracePeriod=30
Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.560555 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69b7f8c44f-4zs4t" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-httpd" containerID="cri-o://9ce013bedffcdb061ece7ba64de1d7a0ecab72853f02f5494a44e429e88dcb81" gracePeriod=30
Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.569951 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7689cffd58-6dzrs" 
event={"ID":"7ca56c45-38cd-4c83-a1d3-44f505b6c402","Type":"ContainerStarted","Data":"5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b"} Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.570104 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7689cffd58-6dzrs" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.571486 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7689cffd58-6dzrs" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.602801 4912 generic.go:334] "Generic (PLEG): container finished" podID="147c4d2b-19d3-48da-9364-c527a1cacc3c" containerID="7d8f21a6020213a9a299dca6955d632607150dff3cf297d54334fdd46836370e" exitCode=0 Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.603172 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fbf2l" event={"ID":"147c4d2b-19d3-48da-9364-c527a1cacc3c","Type":"ContainerDied","Data":"7d8f21a6020213a9a299dca6955d632607150dff3cf297d54334fdd46836370e"} Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.611856 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.643574 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d86cc5c8f-fqc82"] Mar 18 13:27:07 crc kubenswrapper[4912]: E0318 13:27:07.644658 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="dnsmasq-dns" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.644781 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="dnsmasq-dns" Mar 18 13:27:07 crc kubenswrapper[4912]: E0318 13:27:07.644919 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="init" Mar 18 
13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.645030 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="init" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.647018 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="dnsmasq-dns" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.650679 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.689319 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7689cffd58-6dzrs" podStartSLOduration=4.689285136 podStartE2EDuration="4.689285136s" podCreationTimestamp="2026-03-18 13:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:07.678716412 +0000 UTC m=+1476.138143847" watchObservedRunningTime="2026-03-18 13:27:07.689285136 +0000 UTC m=+1476.148712551" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.690215 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d86cc5c8f-fqc82"] Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.759610 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxfxr\" (UniqueName: \"kubernetes.io/projected/6f2b7b0b-4b03-441d-9c94-606e57f8e710-kube-api-access-rxfxr\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.763552 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-combined-ca-bundle\") pod 
\"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.763607 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-public-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.763863 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-internal-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.763931 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-ovndb-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.764128 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-httpd-config\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.764172 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-config\") pod 
\"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.868027 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-httpd-config\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.868109 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-config\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.868240 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxfxr\" (UniqueName: \"kubernetes.io/projected/6f2b7b0b-4b03-441d-9c94-606e57f8e710-kube-api-access-rxfxr\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.868285 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-combined-ca-bundle\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.868306 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-public-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " 
pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.868368 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-internal-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.868396 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-ovndb-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.882436 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-public-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.882941 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-httpd-config\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.892998 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-internal-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.893029 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-combined-ca-bundle\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.895078 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-ovndb-tls-certs\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.903552 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f2b7b0b-4b03-441d-9c94-606e57f8e710-config\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:07 crc kubenswrapper[4912]: I0318 13:27:07.927646 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxfxr\" (UniqueName: \"kubernetes.io/projected/6f2b7b0b-4b03-441d-9c94-606e57f8e710-kube-api-access-rxfxr\") pod \"neutron-7d86cc5c8f-fqc82\" (UID: \"6f2b7b0b-4b03-441d-9c94-606e57f8e710\") " pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:08 crc kubenswrapper[4912]: I0318 13:27:08.033899 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:08 crc kubenswrapper[4912]: I0318 13:27:08.619446 4912 generic.go:334] "Generic (PLEG): container finished" podID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerID="9ce013bedffcdb061ece7ba64de1d7a0ecab72853f02f5494a44e429e88dcb81" exitCode=0 Mar 18 13:27:08 crc kubenswrapper[4912]: I0318 13:27:08.619544 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69b7f8c44f-4zs4t" event={"ID":"d131ebfd-7b8e-441d-bd94-b3d52465ae15","Type":"ContainerDied","Data":"9ce013bedffcdb061ece7ba64de1d7a0ecab72853f02f5494a44e429e88dcb81"} Mar 18 13:27:08 crc kubenswrapper[4912]: I0318 13:27:08.982142 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-gp7xf" podUID="dadc5395-e931-4293-b037-929db9a9bd99" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: i/o timeout" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.015142 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59f85f449d-2mslp"] Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.021592 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.027493 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.027732 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.043397 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59f85f449d-2mslp"] Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.139135 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4p6pn"] Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.142469 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.147692 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-config-data\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.147759 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5262960-4228-43dd-a5d9-0fcdfe8111c3-logs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.147833 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-config-data-custom\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.147877 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-catalog-content\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.147923 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-internal-tls-certs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.147983 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-public-tls-certs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.148023 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxtnv\" (UniqueName: \"kubernetes.io/projected/c5262960-4228-43dd-a5d9-0fcdfe8111c3-kube-api-access-vxtnv\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.148092 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-combined-ca-bundle\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.148123 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-utilities\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.148180 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rq46\" (UniqueName: \"kubernetes.io/projected/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-kube-api-access-6rq46\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.173176 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4p6pn"] Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.257396 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-internal-tls-certs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.257588 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-public-tls-certs\") pod 
\"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.257666 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxtnv\" (UniqueName: \"kubernetes.io/projected/c5262960-4228-43dd-a5d9-0fcdfe8111c3-kube-api-access-vxtnv\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.257765 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-combined-ca-bundle\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.257800 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-utilities\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.257903 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rq46\" (UniqueName: \"kubernetes.io/projected/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-kube-api-access-6rq46\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.258093 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-config-data\") pod 
\"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.258127 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5262960-4228-43dd-a5d9-0fcdfe8111c3-logs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.258207 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-config-data-custom\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.258291 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-catalog-content\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.259408 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-catalog-content\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.267396 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-utilities\") pod \"community-operators-4p6pn\" (UID: 
\"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.267751 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5262960-4228-43dd-a5d9-0fcdfe8111c3-logs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.294400 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-combined-ca-bundle\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.295286 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-public-tls-certs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.301625 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-internal-tls-certs\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.321663 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rq46\" (UniqueName: \"kubernetes.io/projected/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-kube-api-access-6rq46\") pod \"community-operators-4p6pn\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " 
pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.331755 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-config-data-custom\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.332953 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5262960-4228-43dd-a5d9-0fcdfe8111c3-config-data\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.333935 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxtnv\" (UniqueName: \"kubernetes.io/projected/c5262960-4228-43dd-a5d9-0fcdfe8111c3-kube-api-access-vxtnv\") pod \"barbican-api-59f85f449d-2mslp\" (UID: \"c5262960-4228-43dd-a5d9-0fcdfe8111c3\") " pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.409037 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.479435 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.504444 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-69b7f8c44f-4zs4t" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.202:9696/\": dial tcp 10.217.0.202:9696: connect: connection refused" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.538385 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.635620 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rvmbk" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:09 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:09 crc kubenswrapper[4912]: > Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.692769 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-scripts\") pod \"147c4d2b-19d3-48da-9364-c527a1cacc3c\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.693216 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-combined-ca-bundle\") pod \"147c4d2b-19d3-48da-9364-c527a1cacc3c\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.716343 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-scripts" (OuterVolumeSpecName: "scripts") pod 
"147c4d2b-19d3-48da-9364-c527a1cacc3c" (UID: "147c4d2b-19d3-48da-9364-c527a1cacc3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.724143 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/147c4d2b-19d3-48da-9364-c527a1cacc3c-etc-machine-id\") pod \"147c4d2b-19d3-48da-9364-c527a1cacc3c\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.724563 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-config-data\") pod \"147c4d2b-19d3-48da-9364-c527a1cacc3c\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.724678 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glpg5\" (UniqueName: \"kubernetes.io/projected/147c4d2b-19d3-48da-9364-c527a1cacc3c-kube-api-access-glpg5\") pod \"147c4d2b-19d3-48da-9364-c527a1cacc3c\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.724809 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-db-sync-config-data\") pod \"147c4d2b-19d3-48da-9364-c527a1cacc3c\" (UID: \"147c4d2b-19d3-48da-9364-c527a1cacc3c\") " Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.760229 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/147c4d2b-19d3-48da-9364-c527a1cacc3c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "147c4d2b-19d3-48da-9364-c527a1cacc3c" (UID: "147c4d2b-19d3-48da-9364-c527a1cacc3c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.801728 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147c4d2b-19d3-48da-9364-c527a1cacc3c-kube-api-access-glpg5" (OuterVolumeSpecName: "kube-api-access-glpg5") pod "147c4d2b-19d3-48da-9364-c527a1cacc3c" (UID: "147c4d2b-19d3-48da-9364-c527a1cacc3c"). InnerVolumeSpecName "kube-api-access-glpg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.897668 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "147c4d2b-19d3-48da-9364-c527a1cacc3c" (UID: "147c4d2b-19d3-48da-9364-c527a1cacc3c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.915118 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glpg5\" (UniqueName: \"kubernetes.io/projected/147c4d2b-19d3-48da-9364-c527a1cacc3c-kube-api-access-glpg5\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.915177 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.915208 4912 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/147c4d2b-19d3-48da-9364-c527a1cacc3c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.924900 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" 
event={"ID":"eb79a9e1-8101-4bd1-8c03-9d4d527b9203","Type":"ContainerStarted","Data":"21a45e0f16c8968d137adcb483cab24e3f4e5bc5c6e8eadbb00ebe9299f7f022"} Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.925472 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.969631 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fbf2l" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.983274 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fbf2l" event={"ID":"147c4d2b-19d3-48da-9364-c527a1cacc3c","Type":"ContainerDied","Data":"81488a03a7d89c2f077207011325153300696101039858ec7ff6856643ad1127"} Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.983361 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81488a03a7d89c2f077207011325153300696101039858ec7ff6856643ad1127" Mar 18 13:27:09 crc kubenswrapper[4912]: I0318 13:27:09.985668 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "147c4d2b-19d3-48da-9364-c527a1cacc3c" (UID: "147c4d2b-19d3-48da-9364-c527a1cacc3c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.020859 4912 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.021438 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.048811 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" podStartSLOduration=7.048788716 podStartE2EDuration="7.048788716s" podCreationTimestamp="2026-03-18 13:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:10.04859898 +0000 UTC m=+1478.508026425" watchObservedRunningTime="2026-03-18 13:27:10.048788716 +0000 UTC m=+1478.508216141" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.073990 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-config-data" (OuterVolumeSpecName: "config-data") pod "147c4d2b-19d3-48da-9364-c527a1cacc3c" (UID: "147c4d2b-19d3-48da-9364-c527a1cacc3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.145493 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147c4d2b-19d3-48da-9364-c527a1cacc3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.317811 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:10 crc kubenswrapper[4912]: E0318 13:27:10.350403 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147c4d2b-19d3-48da-9364-c527a1cacc3c" containerName="cinder-db-sync" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.350467 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="147c4d2b-19d3-48da-9364-c527a1cacc3c" containerName="cinder-db-sync" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.351137 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="147c4d2b-19d3-48da-9364-c527a1cacc3c" containerName="cinder-db-sync" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.352823 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.352980 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.362412 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd2xl\" (UniqueName: \"kubernetes.io/projected/8423b334-7b23-4086-b08f-22ac5782729c-kube-api-access-rd2xl\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.362481 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8423b334-7b23-4086-b08f-22ac5782729c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.362537 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-scripts\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.362696 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.362742 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 
13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.362784 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: W0318 13:27:10.369754 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f2b7b0b_4b03_441d_9c94_606e57f8e710.slice/crio-a948a991a02b10616601aaea9bf021b1904b1a553745528bbda3d8a1da8dfcd9 WatchSource:0}: Error finding container a948a991a02b10616601aaea9bf021b1904b1a553745528bbda3d8a1da8dfcd9: Status 404 returned error can't find the container with id a948a991a02b10616601aaea9bf021b1904b1a553745528bbda3d8a1da8dfcd9 Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.379638 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.379977 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xv548" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.412212 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d86cc5c8f-fqc82"] Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.417648 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.417981 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.454440 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xkq2h"] Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.469030 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd2xl\" (UniqueName: \"kubernetes.io/projected/8423b334-7b23-4086-b08f-22ac5782729c-kube-api-access-rd2xl\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.469099 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8423b334-7b23-4086-b08f-22ac5782729c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.469143 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-scripts\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.469230 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.469265 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.469294 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.471798 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8423b334-7b23-4086-b08f-22ac5782729c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.480456 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.488248 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7z4mf"] Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.489660 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-scripts\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.491646 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.511263 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd2xl\" (UniqueName: \"kubernetes.io/projected/8423b334-7b23-4086-b08f-22ac5782729c-kube-api-access-rd2xl\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.515540 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.516019 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.575333 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzhqr\" (UniqueName: \"kubernetes.io/projected/1331546f-e949-4d01-97fa-48a28a165bec-kube-api-access-fzhqr\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.575398 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-config\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 
13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.593162 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7z4mf"] Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.594815 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.594992 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.595304 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.595490 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.699167 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-config\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.699265 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.699304 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.699386 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.699438 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.699560 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzhqr\" (UniqueName: 
\"kubernetes.io/projected/1331546f-e949-4d01-97fa-48a28a165bec-kube-api-access-fzhqr\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.707067 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.707727 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.707807 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.708383 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.710296 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-config\") pod 
\"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.721109 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.725441 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.738905 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.743352 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzhqr\" (UniqueName: \"kubernetes.io/projected/1331546f-e949-4d01-97fa-48a28a165bec-kube-api-access-fzhqr\") pod \"dnsmasq-dns-6578955fd5-7z4mf\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.743818 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.804346 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rt6j\" (UniqueName: \"kubernetes.io/projected/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-kube-api-access-9rt6j\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.804428 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.804467 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.804501 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-scripts\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.804560 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data-custom\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 
13:27:10.804598 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.804652 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-logs\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.823458 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.868525 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.919724 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.919874 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-logs\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.919959 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rt6j\" (UniqueName: \"kubernetes.io/projected/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-kube-api-access-9rt6j\") pod \"cinder-api-0\" (UID: 
\"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.920043 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.921962 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-logs\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.924956 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.920131 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.935809 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-scripts\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.936037 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data-custom\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.949873 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data-custom\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.954428 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:10 crc kubenswrapper[4912]: I0318 13:27:10.955408 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-scripts\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.021298 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59f85f449d-2mslp"] Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.022785 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.032205 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rt6j\" (UniqueName: \"kubernetes.io/projected/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-kube-api-access-9rt6j\") pod 
\"cinder-api-0\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " pod="openstack/cinder-api-0" Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.075912 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7965f976fd-25ct8" event={"ID":"94867b1a-6891-4d44-b968-0b18a8b30085","Type":"ContainerStarted","Data":"6f9d26a073f7e789311730cf5375493555e0dd40639a1953e4dd478e8800236b"} Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.089967 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d86cc5c8f-fqc82" event={"ID":"6f2b7b0b-4b03-441d-9c94-606e57f8e710","Type":"ContainerStarted","Data":"a948a991a02b10616601aaea9bf021b1904b1a553745528bbda3d8a1da8dfcd9"} Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.093814 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79f58cdfc5-fz7qx" event={"ID":"85dfe685-650b-44a6-b164-137cae893166","Type":"ContainerStarted","Data":"634c6306e861e209e7259dc9cb21f04113b613b53055b89ee32834e4c8c8e37f"} Mar 18 13:27:11 crc kubenswrapper[4912]: W0318 13:27:11.125685 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5262960_4228_43dd_a5d9_0fcdfe8111c3.slice/crio-b93aae25ab02bcbdc45d01491a62ae98f2f0b80a312bc99d36f3549635443fc9 WatchSource:0}: Error finding container b93aae25ab02bcbdc45d01491a62ae98f2f0b80a312bc99d36f3549635443fc9: Status 404 returned error can't find the container with id b93aae25ab02bcbdc45d01491a62ae98f2f0b80a312bc99d36f3549635443fc9 Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.192855 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4p6pn"] Mar 18 13:27:11 crc kubenswrapper[4912]: I0318 13:27:11.222545 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.009122 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.133293 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:12 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:12 crc kubenswrapper[4912]: > Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.166454 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59f85f449d-2mslp" event={"ID":"c5262960-4228-43dd-a5d9-0fcdfe8111c3","Type":"ContainerStarted","Data":"b93aae25ab02bcbdc45d01491a62ae98f2f0b80a312bc99d36f3549635443fc9"} Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.187896 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d86cc5c8f-fqc82" event={"ID":"6f2b7b0b-4b03-441d-9c94-606e57f8e710","Type":"ContainerStarted","Data":"5d8628fc2f04396044ed94e70c0d67db069e5d0bb10a1a1031df9256d20d1edf"} Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.316558 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79f58cdfc5-fz7qx" event={"ID":"85dfe685-650b-44a6-b164-137cae893166","Type":"ContainerStarted","Data":"8911059b3fda188e05d0c9175c59f5725d7f49a3f4fa4b199d4d485ad5087e5b"} Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.319129 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7965f976fd-25ct8" event={"ID":"94867b1a-6891-4d44-b968-0b18a8b30085","Type":"ContainerStarted","Data":"cb68aca09136885a6169380b2d3e169617e66b289eee8f21c68067ca43485d45"} Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.381732 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8423b334-7b23-4086-b08f-22ac5782729c","Type":"ContainerStarted","Data":"6d5b512aaf5c32a3b9e7e2fdbc51fc53b7442e959a1698ddeb163d3ac5d6b381"} Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.452693 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p6pn" event={"ID":"e6d9cf46-afab-44ff-b703-5c55afdcc2d2","Type":"ContainerStarted","Data":"a0d2cdcb2f796d7996edb4ff676b64f8f42a029542fe24cea41b68690c3fbfcb"} Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.462212 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerName="dnsmasq-dns" containerID="cri-o://21a45e0f16c8968d137adcb483cab24e3f4e5bc5c6e8eadbb00ebe9299f7f022" gracePeriod=10 Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.537267 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7z4mf"] Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.897168 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.913839 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7965f976fd-25ct8" podStartSLOduration=6.111081279 podStartE2EDuration="9.913792235s" podCreationTimestamp="2026-03-18 13:27:03 +0000 UTC" firstStartedPulling="2026-03-18 13:27:04.978816627 +0000 UTC m=+1473.438244052" lastFinishedPulling="2026-03-18 13:27:08.781527583 +0000 UTC m=+1477.240955008" observedRunningTime="2026-03-18 13:27:12.725791709 +0000 UTC m=+1481.185219134" watchObservedRunningTime="2026-03-18 13:27:12.913792235 +0000 UTC m=+1481.373219680" Mar 18 13:27:12 crc kubenswrapper[4912]: I0318 13:27:12.925412 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-worker-79f58cdfc5-fz7qx" podStartSLOduration=6.320445889 podStartE2EDuration="9.925377116s" podCreationTimestamp="2026-03-18 13:27:03 +0000 UTC" firstStartedPulling="2026-03-18 13:27:05.171996222 +0000 UTC m=+1473.631423647" lastFinishedPulling="2026-03-18 13:27:08.776927449 +0000 UTC m=+1477.236354874" observedRunningTime="2026-03-18 13:27:12.774197448 +0000 UTC m=+1481.233624873" watchObservedRunningTime="2026-03-18 13:27:12.925377116 +0000 UTC m=+1481.384804541" Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.592579 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"299eab28-d9b3-4b6d-88d7-358f2c15fd2d","Type":"ContainerStarted","Data":"019db5d1732d6f75d86c430ec9d4f2e82043f07ea3c7af7a752f0ae61df9db2c"} Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.681110 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d86cc5c8f-fqc82" event={"ID":"6f2b7b0b-4b03-441d-9c94-606e57f8e710","Type":"ContainerStarted","Data":"735aa8c88b804f0e39a5df266fb66401121c24d4b57a51052c858e555d2ceff1"} Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.682972 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.705532 4912 generic.go:334] "Generic (PLEG): container finished" podID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerID="21a45e0f16c8968d137adcb483cab24e3f4e5bc5c6e8eadbb00ebe9299f7f022" exitCode=0 Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.705612 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" event={"ID":"eb79a9e1-8101-4bd1-8c03-9d4d527b9203","Type":"ContainerDied","Data":"21a45e0f16c8968d137adcb483cab24e3f4e5bc5c6e8eadbb00ebe9299f7f022"} Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.761713 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-7d86cc5c8f-fqc82" podStartSLOduration=6.761685742 podStartE2EDuration="6.761685742s" podCreationTimestamp="2026-03-18 13:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:13.728623575 +0000 UTC m=+1482.188051020" watchObservedRunningTime="2026-03-18 13:27:13.761685742 +0000 UTC m=+1482.221113167" Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.780997 4912 generic.go:334] "Generic (PLEG): container finished" podID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerID="13c7bc06c22b4583634716708c3c968c5262308701cb89a628ba2c88628f21f4" exitCode=0 Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.783944 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p6pn" event={"ID":"e6d9cf46-afab-44ff-b703-5c55afdcc2d2","Type":"ContainerDied","Data":"13c7bc06c22b4583634716708c3c968c5262308701cb89a628ba2c88628f21f4"} Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.798244 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59f85f449d-2mslp" event={"ID":"c5262960-4228-43dd-a5d9-0fcdfe8111c3","Type":"ContainerStarted","Data":"15689692e44267f154638da8628a316a31c242877d54ec2ab555f90e2a7ab524"} Mar 18 13:27:13 crc kubenswrapper[4912]: I0318 13:27:13.819834 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" event={"ID":"1331546f-e949-4d01-97fa-48a28a165bec","Type":"ContainerStarted","Data":"544dd678336f5493c183f70672d37606d21ed7fcb34a56bc46d75f3382e56fcb"} Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.515835 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.650400 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-svc\") pod \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.650790 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-config\") pod \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.650868 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-swift-storage-0\") pod \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.650910 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-nb\") pod \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.650945 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht58n\" (UniqueName: \"kubernetes.io/projected/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-kube-api-access-ht58n\") pod \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.650981 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-sb\") pod \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\" (UID: \"eb79a9e1-8101-4bd1-8c03-9d4d527b9203\") " Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.660176 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-kube-api-access-ht58n" (OuterVolumeSpecName: "kube-api-access-ht58n") pod "eb79a9e1-8101-4bd1-8c03-9d4d527b9203" (UID: "eb79a9e1-8101-4bd1-8c03-9d4d527b9203"). InnerVolumeSpecName "kube-api-access-ht58n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.761636 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht58n\" (UniqueName: \"kubernetes.io/projected/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-kube-api-access-ht58n\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.792331 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb79a9e1-8101-4bd1-8c03-9d4d527b9203" (UID: "eb79a9e1-8101-4bd1-8c03-9d4d527b9203"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.861061 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-config" (OuterVolumeSpecName: "config") pod "eb79a9e1-8101-4bd1-8c03-9d4d527b9203" (UID: "eb79a9e1-8101-4bd1-8c03-9d4d527b9203"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.871613 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.871644 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.938604 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb79a9e1-8101-4bd1-8c03-9d4d527b9203" (UID: "eb79a9e1-8101-4bd1-8c03-9d4d527b9203"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.948604 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb79a9e1-8101-4bd1-8c03-9d4d527b9203" (UID: "eb79a9e1-8101-4bd1-8c03-9d4d527b9203"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.949747 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eb79a9e1-8101-4bd1-8c03-9d4d527b9203" (UID: "eb79a9e1-8101-4bd1-8c03-9d4d527b9203"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.974639 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.974681 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.974695 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb79a9e1-8101-4bd1-8c03-9d4d527b9203-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:14 crc kubenswrapper[4912]: I0318 13:27:14.999863 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.012372 4912 generic.go:334] "Generic (PLEG): container finished" podID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerID="19564a37a9b24261d7c9934eb493448b57a6b3b15e9b7b8a2666172c28b5565d" exitCode=0 Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.012505 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69b7f8c44f-4zs4t" event={"ID":"d131ebfd-7b8e-441d-bd94-b3d52465ae15","Type":"ContainerDied","Data":"19564a37a9b24261d7c9934eb493448b57a6b3b15e9b7b8a2666172c28b5565d"} Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.040984 4912 generic.go:334] "Generic (PLEG): container finished" podID="1331546f-e949-4d01-97fa-48a28a165bec" containerID="d7c6ba00564984f8a06283f3c97631907fde972d87d8ec35fea9d99f60b41ab9" exitCode=0 Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.041193 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" 
event={"ID":"1331546f-e949-4d01-97fa-48a28a165bec","Type":"ContainerDied","Data":"d7c6ba00564984f8a06283f3c97631907fde972d87d8ec35fea9d99f60b41ab9"} Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.045706 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" event={"ID":"eb79a9e1-8101-4bd1-8c03-9d4d527b9203","Type":"ContainerDied","Data":"e3ec631be4f881e53d3bbbb8bfbfeec83b0c665f824bbd08d760e9ac7bf99f06"} Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.045758 4912 scope.go:117] "RemoveContainer" containerID="21a45e0f16c8968d137adcb483cab24e3f4e5bc5c6e8eadbb00ebe9299f7f022" Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.045894 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.067441 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59f85f449d-2mslp" event={"ID":"c5262960-4228-43dd-a5d9-0fcdfe8111c3","Type":"ContainerStarted","Data":"8dc0e802d7ff5e35fe6d2cc6403b60e1a67a3a337ffd0157cf5dfd119e9fc9a2"} Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.067737 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.067851 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.104681 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59f85f449d-2mslp" podStartSLOduration=7.104655668 podStartE2EDuration="7.104655668s" podCreationTimestamp="2026-03-18 13:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:15.09652902 +0000 UTC m=+1483.555956455" 
watchObservedRunningTime="2026-03-18 13:27:15.104655668 +0000 UTC m=+1483.564083093" Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.156820 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xkq2h"] Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.207537 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xkq2h"] Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.276411 4912 scope.go:117] "RemoveContainer" containerID="6a4231ca2e6e9e35e7bb3f0591fe7a3d0c140182f5de54c31ea831814c6efbe9" Mar 18 13:27:15 crc kubenswrapper[4912]: I0318 13:27:15.491235 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.162699 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p6pn" event={"ID":"e6d9cf46-afab-44ff-b703-5c55afdcc2d2","Type":"ContainerStarted","Data":"0b0d640751d2bc787e4ace3b0fdf3dfd60f52d033fee6c303d17b93c5acdfa11"} Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.178453 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" event={"ID":"1331546f-e949-4d01-97fa-48a28a165bec","Type":"ContainerStarted","Data":"a162166d3edc118ef7452a18bc245f968a9ef84696a13788ba367fa2487f1f5c"} Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.178850 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.196023 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"299eab28-d9b3-4b6d-88d7-358f2c15fd2d","Type":"ContainerStarted","Data":"614faa5191b340f6797d0038f10cb9d79eae2eb421c3f233a8a40dabbb8273d9"} Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.271422 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" podStartSLOduration=6.271396734 podStartE2EDuration="6.271396734s" podCreationTimestamp="2026-03-18 13:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:16.271286031 +0000 UTC m=+1484.730713476" watchObservedRunningTime="2026-03-18 13:27:16.271396734 +0000 UTC m=+1484.730824179" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.340225 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" path="/var/lib/kubelet/pods/eb79a9e1-8101-4bd1-8c03-9d4d527b9203/volumes" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.437870 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.543505 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-combined-ca-bundle\") pod \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.543831 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-httpd-config\") pod \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.543909 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-config\") pod \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.544029 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swbx4\" (UniqueName: \"kubernetes.io/projected/d131ebfd-7b8e-441d-bd94-b3d52465ae15-kube-api-access-swbx4\") pod \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.544184 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-public-tls-certs\") pod \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.544260 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-internal-tls-certs\") pod \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.544308 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-ovndb-tls-certs\") pod \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\" (UID: \"d131ebfd-7b8e-441d-bd94-b3d52465ae15\") " Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.564430 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d131ebfd-7b8e-441d-bd94-b3d52465ae15-kube-api-access-swbx4" (OuterVolumeSpecName: "kube-api-access-swbx4") pod "d131ebfd-7b8e-441d-bd94-b3d52465ae15" (UID: "d131ebfd-7b8e-441d-bd94-b3d52465ae15"). InnerVolumeSpecName "kube-api-access-swbx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.573240 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d131ebfd-7b8e-441d-bd94-b3d52465ae15" (UID: "d131ebfd-7b8e-441d-bd94-b3d52465ae15"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.655225 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swbx4\" (UniqueName: \"kubernetes.io/projected/d131ebfd-7b8e-441d-bd94-b3d52465ae15-kube-api-access-swbx4\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.655271 4912 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.752789 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-config" (OuterVolumeSpecName: "config") pod "d131ebfd-7b8e-441d-bd94-b3d52465ae15" (UID: "d131ebfd-7b8e-441d-bd94-b3d52465ae15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.760631 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.760762 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d131ebfd-7b8e-441d-bd94-b3d52465ae15" (UID: "d131ebfd-7b8e-441d-bd94-b3d52465ae15"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.768210 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d131ebfd-7b8e-441d-bd94-b3d52465ae15" (UID: "d131ebfd-7b8e-441d-bd94-b3d52465ae15"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.873997 4912 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.874424 4912 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.932507 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d131ebfd-7b8e-441d-bd94-b3d52465ae15" (UID: "d131ebfd-7b8e-441d-bd94-b3d52465ae15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.979969 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:16 crc kubenswrapper[4912]: I0318 13:27:16.982455 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d131ebfd-7b8e-441d-bd94-b3d52465ae15" (UID: "d131ebfd-7b8e-441d-bd94-b3d52465ae15"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.082177 4912 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d131ebfd-7b8e-441d-bd94-b3d52465ae15-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.449478 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8423b334-7b23-4086-b08f-22ac5782729c","Type":"ContainerStarted","Data":"9d5b39e5014ae47486a17368c9235b3455c54be065a9000874141d96744478f9"} Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.510632 4912 generic.go:334] "Generic (PLEG): container finished" podID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerID="0b0d640751d2bc787e4ace3b0fdf3dfd60f52d033fee6c303d17b93c5acdfa11" exitCode=0 Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.510715 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p6pn" event={"ID":"e6d9cf46-afab-44ff-b703-5c55afdcc2d2","Type":"ContainerDied","Data":"0b0d640751d2bc787e4ace3b0fdf3dfd60f52d033fee6c303d17b93c5acdfa11"} Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.536111 4912 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69b7f8c44f-4zs4t" Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.536352 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69b7f8c44f-4zs4t" event={"ID":"d131ebfd-7b8e-441d-bd94-b3d52465ae15","Type":"ContainerDied","Data":"21b1895d5ed7473eaf8214266c750af09941bf2d62088f819e99cfd42d5fd80d"} Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.536476 4912 scope.go:117] "RemoveContainer" containerID="9ce013bedffcdb061ece7ba64de1d7a0ecab72853f02f5494a44e429e88dcb81" Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.556166 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"299eab28-d9b3-4b6d-88d7-358f2c15fd2d","Type":"ContainerStarted","Data":"7b5dfd0437ff56f3ad7e0ec7e039c3c6789dfbbc41231f5883c5919f177d2635"} Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.556241 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api-log" containerID="cri-o://614faa5191b340f6797d0038f10cb9d79eae2eb421c3f233a8a40dabbb8273d9" gracePeriod=30 Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.556509 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" containerID="cri-o://7b5dfd0437ff56f3ad7e0ec7e039c3c6789dfbbc41231f5883c5919f177d2635" gracePeriod=30 Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.556977 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.608945 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.608912273 podStartE2EDuration="7.608912273s" 
podCreationTimestamp="2026-03-18 13:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:17.598530084 +0000 UTC m=+1486.057957519" watchObservedRunningTime="2026-03-18 13:27:17.608912273 +0000 UTC m=+1486.068339698" Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.648295 4912 scope.go:117] "RemoveContainer" containerID="19564a37a9b24261d7c9934eb493448b57a6b3b15e9b7b8a2666172c28b5565d" Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.674853 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69b7f8c44f-4zs4t"] Mar 18 13:27:17 crc kubenswrapper[4912]: I0318 13:27:17.691446 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-69b7f8c44f-4zs4t"] Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.264759 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" path="/var/lib/kubelet/pods/d131ebfd-7b8e-441d-bd94-b3d52465ae15/volumes" Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.484292 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.585821 4912 generic.go:334] "Generic (PLEG): container finished" podID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerID="614faa5191b340f6797d0038f10cb9d79eae2eb421c3f233a8a40dabbb8273d9" exitCode=143 Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.585910 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"299eab28-d9b3-4b6d-88d7-358f2c15fd2d","Type":"ContainerDied","Data":"614faa5191b340f6797d0038f10cb9d79eae2eb421c3f233a8a40dabbb8273d9"} Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.591350 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8423b334-7b23-4086-b08f-22ac5782729c","Type":"ContainerStarted","Data":"df563af7d6a312b666834d02dd62f09d6aa587bbcda731e8f4efca4f03ca2c00"} Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.599485 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p6pn" event={"ID":"e6d9cf46-afab-44ff-b703-5c55afdcc2d2","Type":"ContainerStarted","Data":"454c7be19d73679c256628d13e874e24bb5eeb28e155e4562045b0c2c18d55b5"} Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.629256 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.056910998 podStartE2EDuration="8.629232339s" podCreationTimestamp="2026-03-18 13:27:10 +0000 UTC" firstStartedPulling="2026-03-18 13:27:12.055991081 +0000 UTC m=+1480.515418506" lastFinishedPulling="2026-03-18 13:27:13.628312422 +0000 UTC m=+1482.087739847" observedRunningTime="2026-03-18 13:27:18.623113235 +0000 UTC m=+1487.082540670" watchObservedRunningTime="2026-03-18 13:27:18.629232339 +0000 UTC m=+1487.088659764" Mar 18 13:27:18 crc kubenswrapper[4912]: I0318 13:27:18.660055 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4p6pn" podStartSLOduration=5.333417617 podStartE2EDuration="9.660015875s" podCreationTimestamp="2026-03-18 13:27:09 +0000 UTC" firstStartedPulling="2026-03-18 13:27:13.803238057 +0000 UTC m=+1482.262665482" lastFinishedPulling="2026-03-18 13:27:18.129836315 +0000 UTC m=+1486.589263740" observedRunningTime="2026-03-18 13:27:18.648567188 +0000 UTC m=+1487.107994623" watchObservedRunningTime="2026-03-18 13:27:18.660015875 
+0000 UTC m=+1487.119443300" Mar 18 13:27:19 crc kubenswrapper[4912]: I0318 13:27:19.018570 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-xkq2h" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.208:5353: i/o timeout" Mar 18 13:27:19 crc kubenswrapper[4912]: I0318 13:27:19.480981 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:19 crc kubenswrapper[4912]: I0318 13:27:19.481091 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:19 crc kubenswrapper[4912]: I0318 13:27:19.528327 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:19 crc kubenswrapper[4912]: I0318 13:27:19.528794 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:19 crc kubenswrapper[4912]: I0318 13:27:19.582965 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rvmbk" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:19 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:19 crc kubenswrapper[4912]: > Mar 18 13:27:20 crc kubenswrapper[4912]: I0318 13:27:20.534299 4912 
prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:20 crc kubenswrapper[4912]: I0318 13:27:20.557635 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4p6pn" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:20 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:20 crc kubenswrapper[4912]: > Mar 18 13:27:20 crc kubenswrapper[4912]: I0318 13:27:20.744168 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 13:27:20 crc kubenswrapper[4912]: I0318 13:27:20.883310 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:21 crc kubenswrapper[4912]: I0318 13:27:21.009481 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zqbcj"] Mar 18 13:27:21 crc kubenswrapper[4912]: I0318 13:27:21.009919 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" podUID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerName="dnsmasq-dns" containerID="cri-o://573326ca6595d80b8945733ff296bc5a27913cf458e237a64a43b1ddc6cc7109" gracePeriod=10 Mar 18 13:27:21 crc kubenswrapper[4912]: I0318 13:27:21.704601 4912 generic.go:334] "Generic (PLEG): container finished" podID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerID="573326ca6595d80b8945733ff296bc5a27913cf458e237a64a43b1ddc6cc7109" exitCode=0 Mar 18 13:27:21 crc kubenswrapper[4912]: I0318 13:27:21.705114 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" event={"ID":"66d0e89e-66f1-4da6-b974-4ab3c60b3520","Type":"ContainerDied","Data":"573326ca6595d80b8945733ff296bc5a27913cf458e237a64a43b1ddc6cc7109"} Mar 18 13:27:21 crc kubenswrapper[4912]: I0318 13:27:21.906842 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:27:21 crc kubenswrapper[4912]: I0318 13:27:21.946365 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:21 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:21 crc kubenswrapper[4912]: > Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.099790 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-nb\") pod \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.100491 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-svc\") pod \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.101258 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-swift-storage-0\") pod \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.101310 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-config\") pod \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.101443 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-sb\") pod \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.101486 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2g6\" (UniqueName: \"kubernetes.io/projected/66d0e89e-66f1-4da6-b974-4ab3c60b3520-kube-api-access-vr2g6\") pod \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\" (UID: \"66d0e89e-66f1-4da6-b974-4ab3c60b3520\") " Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.139016 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d0e89e-66f1-4da6-b974-4ab3c60b3520-kube-api-access-vr2g6" (OuterVolumeSpecName: "kube-api-access-vr2g6") pod "66d0e89e-66f1-4da6-b974-4ab3c60b3520" (UID: "66d0e89e-66f1-4da6-b974-4ab3c60b3520"). InnerVolumeSpecName "kube-api-access-vr2g6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.205312 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2g6\" (UniqueName: \"kubernetes.io/projected/66d0e89e-66f1-4da6-b974-4ab3c60b3520-kube-api-access-vr2g6\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.230925 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66d0e89e-66f1-4da6-b974-4ab3c60b3520" (UID: "66d0e89e-66f1-4da6-b974-4ab3c60b3520"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.308793 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "66d0e89e-66f1-4da6-b974-4ab3c60b3520" (UID: "66d0e89e-66f1-4da6-b974-4ab3c60b3520"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.314986 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.315020 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.335470 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66d0e89e-66f1-4da6-b974-4ab3c60b3520" (UID: "66d0e89e-66f1-4da6-b974-4ab3c60b3520"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.346638 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-config" (OuterVolumeSpecName: "config") pod "66d0e89e-66f1-4da6-b974-4ab3c60b3520" (UID: "66d0e89e-66f1-4da6-b974-4ab3c60b3520"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.386707 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66d0e89e-66f1-4da6-b974-4ab3c60b3520" (UID: "66d0e89e-66f1-4da6-b974-4ab3c60b3520"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.417920 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.417967 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.417981 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66d0e89e-66f1-4da6-b974-4ab3c60b3520-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.756378 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" event={"ID":"66d0e89e-66f1-4da6-b974-4ab3c60b3520","Type":"ContainerDied","Data":"5fbc86b4ea71b0a9a323ae52dcdee82592383b115430fdab475e8d3d8b6256d1"} Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.756875 4912 scope.go:117] "RemoveContainer" containerID="573326ca6595d80b8945733ff296bc5a27913cf458e237a64a43b1ddc6cc7109" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.757214 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zqbcj" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.843147 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zqbcj"] Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.867330 4912 scope.go:117] "RemoveContainer" containerID="0235ce3e0286fa2bfa42a9399dc6d10ab8c54015656606c3f7805aa2190686da" Mar 18 13:27:22 crc kubenswrapper[4912]: I0318 13:27:22.873520 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zqbcj"] Mar 18 13:27:23 crc kubenswrapper[4912]: I0318 13:27:23.425534 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-59f85f449d-2mslp" podUID="c5262960-4228-43dd-a5d9-0fcdfe8111c3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.211:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:23 crc kubenswrapper[4912]: I0318 13:27:23.528249 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.260377 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" path="/var/lib/kubelet/pods/66d0e89e-66f1-4da6-b974-4ab3c60b3520/volumes" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.422305 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59f85f449d-2mslp" podUID="c5262960-4228-43dd-a5d9-0fcdfe8111c3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.211:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.422774 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59f85f449d-2mslp" podUID="c5262960-4228-43dd-a5d9-0fcdfe8111c3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.211:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.440839 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.613249 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.613577 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.633518 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7689cffd58-6dzrs" Mar 18 13:27:24 crc kubenswrapper[4912]: I0318 13:27:24.641256 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7689cffd58-6dzrs" Mar 18 13:27:25 crc kubenswrapper[4912]: I0318 13:27:25.746902 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="cinder-scheduler" 
probeResult="failure" output="Get \"http://10.217.0.213:8080/\": dial tcp 10.217.0.213:8080: connect: connection refused" Mar 18 13:27:25 crc kubenswrapper[4912]: I0318 13:27:25.888451 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:27:25 crc kubenswrapper[4912]: I0318 13:27:25.947892 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:27:26 crc kubenswrapper[4912]: I0318 13:27:26.091725 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f6bb888-s77j2" Mar 18 13:27:26 crc kubenswrapper[4912]: I0318 13:27:26.166868 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-545f7ccb8-vvt42"] Mar 18 13:27:26 crc kubenswrapper[4912]: I0318 13:27:26.583345 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:27:26 crc kubenswrapper[4912]: I0318 13:27:26.909262 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-545f7ccb8-vvt42" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-log" containerID="cri-o://652095645d51d0cc2395a38caac6d84a53965396990722058dc23ae7de41adc8" gracePeriod=30 Mar 18 13:27:26 crc kubenswrapper[4912]: I0318 13:27:26.909281 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-545f7ccb8-vvt42" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-api" containerID="cri-o://ae02311b5b822593fc668323470d6c25e6034495d3108df0352f6b04f8e73911" gracePeriod=30 Mar 18 13:27:26 crc kubenswrapper[4912]: I0318 13:27:26.934918 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/placement-545f7ccb8-vvt42" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-api" probeResult="failure" output="Get 
\"https://10.217.0.204:8778/\": EOF" Mar 18 13:27:27 crc kubenswrapper[4912]: I0318 13:27:27.923851 4912 generic.go:334] "Generic (PLEG): container finished" podID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerID="652095645d51d0cc2395a38caac6d84a53965396990722058dc23ae7de41adc8" exitCode=143 Mar 18 13:27:27 crc kubenswrapper[4912]: I0318 13:27:27.924204 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-545f7ccb8-vvt42" event={"ID":"159b42fc-4ed2-409f-9bda-f68df79afb47","Type":"ContainerDied","Data":"652095645d51d0cc2395a38caac6d84a53965396990722058dc23ae7de41adc8"} Mar 18 13:27:28 crc kubenswrapper[4912]: I0318 13:27:28.261911 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59f85f449d-2mslp" Mar 18 13:27:28 crc kubenswrapper[4912]: I0318 13:27:28.278949 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-79d7d9b7f7-jbmpn" Mar 18 13:27:28 crc kubenswrapper[4912]: I0318 13:27:28.414275 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7689cffd58-6dzrs"] Mar 18 13:27:28 crc kubenswrapper[4912]: I0318 13:27:28.416392 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" containerID="cri-o://2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b" gracePeriod=30 Mar 18 13:27:28 crc kubenswrapper[4912]: I0318 13:27:28.417620 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" containerID="cri-o://5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b" gracePeriod=30 Mar 18 13:27:28 crc kubenswrapper[4912]: I0318 13:27:28.963132 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerID="2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b" exitCode=143 Mar 18 13:27:28 crc kubenswrapper[4912]: I0318 13:27:28.963670 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7689cffd58-6dzrs" event={"ID":"7ca56c45-38cd-4c83-a1d3-44f505b6c402","Type":"ContainerDied","Data":"2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b"} Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.163199 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 13:27:29 crc kubenswrapper[4912]: E0318 13:27:29.163841 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-api" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.163864 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-api" Mar 18 13:27:29 crc kubenswrapper[4912]: E0318 13:27:29.163885 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-httpd" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.163894 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-httpd" Mar 18 13:27:29 crc kubenswrapper[4912]: E0318 13:27:29.163918 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerName="init" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.163924 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerName="init" Mar 18 13:27:29 crc kubenswrapper[4912]: E0318 13:27:29.163953 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerName="init" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 
13:27:29.163959 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerName="init" Mar 18 13:27:29 crc kubenswrapper[4912]: E0318 13:27:29.163980 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerName="dnsmasq-dns" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.163986 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerName="dnsmasq-dns" Mar 18 13:27:29 crc kubenswrapper[4912]: E0318 13:27:29.163999 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerName="dnsmasq-dns" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.164006 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerName="dnsmasq-dns" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.164280 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb79a9e1-8101-4bd1-8c03-9d4d527b9203" containerName="dnsmasq-dns" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.164296 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d0e89e-66f1-4da6-b974-4ab3c60b3520" containerName="dnsmasq-dns" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.164315 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-api" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.164333 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d131ebfd-7b8e-441d-bd94-b3d52465ae15" containerName="neutron-httpd" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.165375 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.170252 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.170427 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.170831 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-sgpbq" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.177915 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.258657 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/311d61bd-9241-486c-a8d5-22fc93f208bc-openstack-config\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.259393 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswqq\" (UniqueName: \"kubernetes.io/projected/311d61bd-9241-486c-a8d5-22fc93f208bc-kube-api-access-sswqq\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.259886 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/311d61bd-9241-486c-a8d5-22fc93f208bc-openstack-config-secret\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.260268 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d61bd-9241-486c-a8d5-22fc93f208bc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.363888 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/311d61bd-9241-486c-a8d5-22fc93f208bc-openstack-config-secret\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.363967 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d61bd-9241-486c-a8d5-22fc93f208bc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.364162 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/311d61bd-9241-486c-a8d5-22fc93f208bc-openstack-config\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.364242 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sswqq\" (UniqueName: \"kubernetes.io/projected/311d61bd-9241-486c-a8d5-22fc93f208bc-kube-api-access-sswqq\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.366146 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/311d61bd-9241-486c-a8d5-22fc93f208bc-openstack-config\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.378258 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/311d61bd-9241-486c-a8d5-22fc93f208bc-openstack-config-secret\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.378765 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d61bd-9241-486c-a8d5-22fc93f208bc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.385102 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswqq\" (UniqueName: \"kubernetes.io/projected/311d61bd-9241-486c-a8d5-22fc93f208bc-kube-api-access-sswqq\") pod \"openstackclient\" (UID: \"311d61bd-9241-486c-a8d5-22fc93f208bc\") " pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.475745 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.510661 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 13:27:29 crc kubenswrapper[4912]: I0318 13:27:29.604849 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rvmbk" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:29 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:29 crc kubenswrapper[4912]: > Mar 18 13:27:30 crc kubenswrapper[4912]: I0318 13:27:30.352907 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 13:27:30 crc kubenswrapper[4912]: I0318 13:27:30.571993 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4p6pn" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:30 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:30 crc kubenswrapper[4912]: > Mar 18 13:27:31 crc kubenswrapper[4912]: I0318 13:27:31.005423 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"311d61bd-9241-486c-a8d5-22fc93f208bc","Type":"ContainerStarted","Data":"d20a726eec681c8f88a77e87345982804af4b00f5aaffce9430276ebbc995bb0"} Mar 18 13:27:31 crc kubenswrapper[4912]: I0318 13:27:31.266466 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.215:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:31 crc kubenswrapper[4912]: I0318 13:27:31.328567 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 13:27:31 crc kubenswrapper[4912]: I0318 13:27:31.414796 4912 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:31 crc kubenswrapper[4912]: I0318 13:27:31.969291 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:31 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:31 crc kubenswrapper[4912]: > Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.097431 4912 generic.go:334] "Generic (PLEG): container finished" podID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerID="ae02311b5b822593fc668323470d6c25e6034495d3108df0352f6b04f8e73911" exitCode=0 Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.099610 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-545f7ccb8-vvt42" event={"ID":"159b42fc-4ed2-409f-9bda-f68df79afb47","Type":"ContainerDied","Data":"ae02311b5b822593fc668323470d6c25e6034495d3108df0352f6b04f8e73911"} Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.099913 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="cinder-scheduler" containerID="cri-o://9d5b39e5014ae47486a17368c9235b3455c54be065a9000874141d96744478f9" gracePeriod=30 Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.100730 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="probe" containerID="cri-o://df563af7d6a312b666834d02dd62f09d6aa587bbcda731e8f4efca4f03ca2c00" gracePeriod=30 Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.170122 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" probeResult="failure" 
output="Get \"http://10.217.0.209:9311/healthcheck\": read tcp 10.217.0.2:58020->10.217.0.209:9311: read: connection reset by peer" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.170502 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7689cffd58-6dzrs" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.209:9311/healthcheck\": read tcp 10.217.0.2:58022->10.217.0.209:9311: read: connection reset by peer" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.458454 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.602074 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889zf\" (UniqueName: \"kubernetes.io/projected/159b42fc-4ed2-409f-9bda-f68df79afb47-kube-api-access-889zf\") pod \"159b42fc-4ed2-409f-9bda-f68df79afb47\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.602719 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-combined-ca-bundle\") pod \"159b42fc-4ed2-409f-9bda-f68df79afb47\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.602843 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159b42fc-4ed2-409f-9bda-f68df79afb47-logs\") pod \"159b42fc-4ed2-409f-9bda-f68df79afb47\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.602989 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-config-data\") pod \"159b42fc-4ed2-409f-9bda-f68df79afb47\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.603128 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-internal-tls-certs\") pod \"159b42fc-4ed2-409f-9bda-f68df79afb47\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.603234 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-scripts\") pod \"159b42fc-4ed2-409f-9bda-f68df79afb47\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.603316 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-public-tls-certs\") pod \"159b42fc-4ed2-409f-9bda-f68df79afb47\" (UID: \"159b42fc-4ed2-409f-9bda-f68df79afb47\") " Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.613580 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159b42fc-4ed2-409f-9bda-f68df79afb47-logs" (OuterVolumeSpecName: "logs") pod "159b42fc-4ed2-409f-9bda-f68df79afb47" (UID: "159b42fc-4ed2-409f-9bda-f68df79afb47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.642112 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159b42fc-4ed2-409f-9bda-f68df79afb47-kube-api-access-889zf" (OuterVolumeSpecName: "kube-api-access-889zf") pod "159b42fc-4ed2-409f-9bda-f68df79afb47" (UID: "159b42fc-4ed2-409f-9bda-f68df79afb47"). 
InnerVolumeSpecName "kube-api-access-889zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.642286 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-scripts" (OuterVolumeSpecName: "scripts") pod "159b42fc-4ed2-409f-9bda-f68df79afb47" (UID: "159b42fc-4ed2-409f-9bda-f68df79afb47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.718117 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.718165 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889zf\" (UniqueName: \"kubernetes.io/projected/159b42fc-4ed2-409f-9bda-f68df79afb47-kube-api-access-889zf\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.718182 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159b42fc-4ed2-409f-9bda-f68df79afb47-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.895752 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-config-data" (OuterVolumeSpecName: "config-data") pod "159b42fc-4ed2-409f-9bda-f68df79afb47" (UID: "159b42fc-4ed2-409f-9bda-f68df79afb47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.933091 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "159b42fc-4ed2-409f-9bda-f68df79afb47" (UID: "159b42fc-4ed2-409f-9bda-f68df79afb47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.946539 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "159b42fc-4ed2-409f-9bda-f68df79afb47" (UID: "159b42fc-4ed2-409f-9bda-f68df79afb47"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.967118 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.968518 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:32 crc kubenswrapper[4912]: I0318 13:27:32.968601 4912 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.035411 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7689cffd58-6dzrs" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.044812 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "159b42fc-4ed2-409f-9bda-f68df79afb47" (UID: "159b42fc-4ed2-409f-9bda-f68df79afb47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.070651 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-combined-ca-bundle\") pod \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.070761 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds4vc\" (UniqueName: \"kubernetes.io/projected/7ca56c45-38cd-4c83-a1d3-44f505b6c402-kube-api-access-ds4vc\") pod \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.070841 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data-custom\") pod \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.070940 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ca56c45-38cd-4c83-a1d3-44f505b6c402-logs\") pod \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.071187 
4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data\") pod \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\" (UID: \"7ca56c45-38cd-4c83-a1d3-44f505b6c402\") " Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.071844 4912 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/159b42fc-4ed2-409f-9bda-f68df79afb47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.072910 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca56c45-38cd-4c83-a1d3-44f505b6c402-logs" (OuterVolumeSpecName: "logs") pod "7ca56c45-38cd-4c83-a1d3-44f505b6c402" (UID: "7ca56c45-38cd-4c83-a1d3-44f505b6c402"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.075187 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ca56c45-38cd-4c83-a1d3-44f505b6c402" (UID: "7ca56c45-38cd-4c83-a1d3-44f505b6c402"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.082243 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca56c45-38cd-4c83-a1d3-44f505b6c402-kube-api-access-ds4vc" (OuterVolumeSpecName: "kube-api-access-ds4vc") pod "7ca56c45-38cd-4c83-a1d3-44f505b6c402" (UID: "7ca56c45-38cd-4c83-a1d3-44f505b6c402"). InnerVolumeSpecName "kube-api-access-ds4vc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.136745 4912 generic.go:334] "Generic (PLEG): container finished" podID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerID="5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b" exitCode=0 Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.136875 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7689cffd58-6dzrs" event={"ID":"7ca56c45-38cd-4c83-a1d3-44f505b6c402","Type":"ContainerDied","Data":"5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b"} Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.136913 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7689cffd58-6dzrs" event={"ID":"7ca56c45-38cd-4c83-a1d3-44f505b6c402","Type":"ContainerDied","Data":"9a7f5bd90d39b68a0dd6c7127e1a9288ca9d5bec9197c8d1f9be43af1113a055"} Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.136933 4912 scope.go:117] "RemoveContainer" containerID="5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.137227 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7689cffd58-6dzrs" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.153853 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ca56c45-38cd-4c83-a1d3-44f505b6c402" (UID: "7ca56c45-38cd-4c83-a1d3-44f505b6c402"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.156293 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-545f7ccb8-vvt42" event={"ID":"159b42fc-4ed2-409f-9bda-f68df79afb47","Type":"ContainerDied","Data":"d41b09baed569265b952e2ad8f30c824550c33a4e08daf1e2b87fa2b61dba9fa"} Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.156422 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-545f7ccb8-vvt42" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.162607 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data" (OuterVolumeSpecName: "config-data") pod "7ca56c45-38cd-4c83-a1d3-44f505b6c402" (UID: "7ca56c45-38cd-4c83-a1d3-44f505b6c402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.179619 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.179790 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.185407 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds4vc\" (UniqueName: \"kubernetes.io/projected/7ca56c45-38cd-4c83-a1d3-44f505b6c402-kube-api-access-ds4vc\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.185461 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7ca56c45-38cd-4c83-a1d3-44f505b6c402-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.185475 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ca56c45-38cd-4c83-a1d3-44f505b6c402-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.231691 4912 scope.go:117] "RemoveContainer" containerID="2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.245483 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-545f7ccb8-vvt42"] Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.260802 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-545f7ccb8-vvt42"] Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.311195 4912 scope.go:117] "RemoveContainer" containerID="5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b" Mar 18 13:27:33 crc kubenswrapper[4912]: E0318 13:27:33.316488 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b\": container with ID starting with 5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b not found: ID does not exist" containerID="5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.316576 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b"} err="failed to get container status \"5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b\": rpc error: code = NotFound desc = could not find container \"5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b\": container with ID 
starting with 5b6956873c0daf742b985a8628d4c0234f4f8a894896873acfbdc5614c99b15b not found: ID does not exist" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.316637 4912 scope.go:117] "RemoveContainer" containerID="2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b" Mar 18 13:27:33 crc kubenswrapper[4912]: E0318 13:27:33.319071 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b\": container with ID starting with 2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b not found: ID does not exist" containerID="2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.319118 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b"} err="failed to get container status \"2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b\": rpc error: code = NotFound desc = could not find container \"2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b\": container with ID starting with 2b5dc284c45d105959a4754c9441ab90530369773990a6f07b18068848fc982b not found: ID does not exist" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.319136 4912 scope.go:117] "RemoveContainer" containerID="ae02311b5b822593fc668323470d6c25e6034495d3108df0352f6b04f8e73911" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.426294 4912 scope.go:117] "RemoveContainer" containerID="652095645d51d0cc2395a38caac6d84a53965396990722058dc23ae7de41adc8" Mar 18 13:27:33 crc kubenswrapper[4912]: E0318 13:27:33.468474 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ca56c45_38cd_4c83_a1d3_44f505b6c402.slice\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ca56c45_38cd_4c83_a1d3_44f505b6c402.slice/crio-9a7f5bd90d39b68a0dd6c7127e1a9288ca9d5bec9197c8d1f9be43af1113a055\": RecentStats: unable to find data in memory cache]" Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.506401 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7689cffd58-6dzrs"] Mar 18 13:27:33 crc kubenswrapper[4912]: I0318 13:27:33.540478 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7689cffd58-6dzrs"] Mar 18 13:27:34 crc kubenswrapper[4912]: I0318 13:27:34.175061 4912 generic.go:334] "Generic (PLEG): container finished" podID="8423b334-7b23-4086-b08f-22ac5782729c" containerID="df563af7d6a312b666834d02dd62f09d6aa587bbcda731e8f4efca4f03ca2c00" exitCode=0 Mar 18 13:27:34 crc kubenswrapper[4912]: I0318 13:27:34.175152 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8423b334-7b23-4086-b08f-22ac5782729c","Type":"ContainerDied","Data":"df563af7d6a312b666834d02dd62f09d6aa587bbcda731e8f4efca4f03ca2c00"} Mar 18 13:27:34 crc kubenswrapper[4912]: I0318 13:27:34.251829 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" path="/var/lib/kubelet/pods/159b42fc-4ed2-409f-9bda-f68df79afb47/volumes" Mar 18 13:27:34 crc kubenswrapper[4912]: I0318 13:27:34.252797 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" path="/var/lib/kubelet/pods/7ca56c45-38cd-4c83-a1d3-44f505b6c402/volumes" Mar 18 13:27:34 crc kubenswrapper[4912]: I0318 13:27:34.781108 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.203085 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="8423b334-7b23-4086-b08f-22ac5782729c" containerID="9d5b39e5014ae47486a17368c9235b3455c54be065a9000874141d96744478f9" exitCode=0 Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.203108 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8423b334-7b23-4086-b08f-22ac5782729c","Type":"ContainerDied","Data":"9d5b39e5014ae47486a17368c9235b3455c54be065a9000874141d96744478f9"} Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.814832 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.865070 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data\") pod \"8423b334-7b23-4086-b08f-22ac5782729c\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.865170 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd2xl\" (UniqueName: \"kubernetes.io/projected/8423b334-7b23-4086-b08f-22ac5782729c-kube-api-access-rd2xl\") pod \"8423b334-7b23-4086-b08f-22ac5782729c\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.865234 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-combined-ca-bundle\") pod \"8423b334-7b23-4086-b08f-22ac5782729c\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.865348 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data-custom\") pod \"8423b334-7b23-4086-b08f-22ac5782729c\" 
(UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.865487 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8423b334-7b23-4086-b08f-22ac5782729c-etc-machine-id\") pod \"8423b334-7b23-4086-b08f-22ac5782729c\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.865610 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-scripts\") pod \"8423b334-7b23-4086-b08f-22ac5782729c\" (UID: \"8423b334-7b23-4086-b08f-22ac5782729c\") " Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.865748 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8423b334-7b23-4086-b08f-22ac5782729c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8423b334-7b23-4086-b08f-22ac5782729c" (UID: "8423b334-7b23-4086-b08f-22ac5782729c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.866312 4912 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8423b334-7b23-4086-b08f-22ac5782729c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.890544 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8423b334-7b23-4086-b08f-22ac5782729c" (UID: "8423b334-7b23-4086-b08f-22ac5782729c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.890936 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-scripts" (OuterVolumeSpecName: "scripts") pod "8423b334-7b23-4086-b08f-22ac5782729c" (UID: "8423b334-7b23-4086-b08f-22ac5782729c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.902911 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8423b334-7b23-4086-b08f-22ac5782729c-kube-api-access-rd2xl" (OuterVolumeSpecName: "kube-api-access-rd2xl") pod "8423b334-7b23-4086-b08f-22ac5782729c" (UID: "8423b334-7b23-4086-b08f-22ac5782729c"). InnerVolumeSpecName "kube-api-access-rd2xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.969831 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.969945 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd2xl\" (UniqueName: \"kubernetes.io/projected/8423b334-7b23-4086-b08f-22ac5782729c-kube-api-access-rd2xl\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.969961 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:35 crc kubenswrapper[4912]: I0318 13:27:35.970216 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "8423b334-7b23-4086-b08f-22ac5782729c" (UID: "8423b334-7b23-4086-b08f-22ac5782729c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.074061 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.088360 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data" (OuterVolumeSpecName: "config-data") pod "8423b334-7b23-4086-b08f-22ac5782729c" (UID: "8423b334-7b23-4086-b08f-22ac5782729c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.128937 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.176974 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-log-httpd\") pod \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.177100 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-config-data\") pod \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.177410 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l5r2\" (UniqueName: \"kubernetes.io/projected/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-kube-api-access-9l5r2\") pod \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.177505 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-run-httpd\") pod \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.177534 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-scripts\") pod \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.177623 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-sg-core-conf-yaml\") pod \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.177807 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-combined-ca-bundle\") pod \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\" (UID: \"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c\") " Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.178641 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423b334-7b23-4086-b08f-22ac5782729c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.178694 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" (UID: "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.179273 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" (UID: "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.183345 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-kube-api-access-9l5r2" (OuterVolumeSpecName: "kube-api-access-9l5r2") pod "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" (UID: "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"). InnerVolumeSpecName "kube-api-access-9l5r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.186801 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-scripts" (OuterVolumeSpecName: "scripts") pod "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" (UID: "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.227518 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" (UID: "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.234159 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" (UID: "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.267231 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.272981 4912 generic.go:334] "Generic (PLEG): container finished" podID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerID="e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870" exitCode=137 Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.273266 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.294492 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.294552 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l5r2\" (UniqueName: \"kubernetes.io/projected/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-kube-api-access-9l5r2\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.294568 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.294581 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.294619 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.294632 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.295780 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-config-data" (OuterVolumeSpecName: "config-data") pod "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" (UID: "509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.319332 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8423b334-7b23-4086-b08f-22ac5782729c","Type":"ContainerDied","Data":"6d5b512aaf5c32a3b9e7e2fdbc51fc53b7442e959a1698ddeb163d3ac5d6b381"} Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.319412 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c","Type":"ContainerDied","Data":"e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870"} Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.319436 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c","Type":"ContainerDied","Data":"cc4d4b96a8d253f094a532d99734e82074ec72609ce62778cb6ee66c80ec75ec"} Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.319464 4912 scope.go:117] "RemoveContainer" containerID="df563af7d6a312b666834d02dd62f09d6aa587bbcda731e8f4efca4f03ca2c00" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.333243 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.378798 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.394705 4912 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395276 4912 scope.go:117] "RemoveContainer" containerID="9d5b39e5014ae47486a17368c9235b3455c54be065a9000874141d96744478f9" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395451 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="cinder-scheduler" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395473 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="cinder-scheduler" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395499 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="probe" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395509 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="probe" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395522 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395531 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395549 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="sg-core" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395555 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="sg-core" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395570 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" 
containerName="placement-api" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395576 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-api" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395585 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="proxy-httpd" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395592 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="proxy-httpd" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395613 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-log" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395620 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-log" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.395633 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395641 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395877 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-log" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395908 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api-log" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395919 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" 
containerName="proxy-httpd" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395927 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" containerName="sg-core" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395933 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca56c45-38cd-4c83-a1d3-44f505b6c402" containerName="barbican-api" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395953 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="159b42fc-4ed2-409f-9bda-f68df79afb47" containerName="placement-api" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395962 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="cinder-scheduler" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.395978 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8423b334-7b23-4086-b08f-22ac5782729c" containerName="probe" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.396693 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.397965 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.402481 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.414014 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.497748 4912 scope.go:117] "RemoveContainer" containerID="e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.501655 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.501854 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xsf9\" (UniqueName: \"kubernetes.io/projected/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-kube-api-access-7xsf9\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.502140 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.502183 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-config-data\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.502260 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.502779 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-scripts\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.542771 4912 scope.go:117] "RemoveContainer" containerID="b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.606106 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xsf9\" (UniqueName: \"kubernetes.io/projected/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-kube-api-access-7xsf9\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.606217 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.606242 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-config-data\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.606271 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.606381 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-scripts\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.606509 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.608630 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.610226 4912 scope.go:117] "RemoveContainer" containerID="e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.611371 4912 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870\": container with ID starting with e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870 not found: ID does not exist" containerID="e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.611411 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870"} err="failed to get container status \"e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870\": rpc error: code = NotFound desc = could not find container \"e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870\": container with ID starting with e2bc1dec7188823ccb972768a2ae29a7591b01f9a521798c133f39be950c7870 not found: ID does not exist" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.611459 4912 scope.go:117] "RemoveContainer" containerID="b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5" Mar 18 13:27:36 crc kubenswrapper[4912]: E0318 13:27:36.615424 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5\": container with ID starting with b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5 not found: ID does not exist" containerID="b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.615506 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5"} err="failed to get container status \"b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5\": rpc error: code = NotFound desc = could not find 
container \"b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5\": container with ID starting with b4d1928444f5738f278424086e23041f64b034f4280384ec5c7b7e61391fa9b5 not found: ID does not exist" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.617218 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-config-data\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.617573 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.618628 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.619916 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-scripts\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.635748 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xsf9\" (UniqueName: \"kubernetes.io/projected/ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc-kube-api-access-7xsf9\") pod \"cinder-scheduler-0\" (UID: \"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc\") " 
pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.647199 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.672722 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.693934 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.699574 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.702006 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.705789 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.705969 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.709900 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-run-httpd\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.709944 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-scripts\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.710031 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-config-data\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.710118 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsm6f\" (UniqueName: \"kubernetes.io/projected/40cdba8a-2602-427c-a95c-65edb257c369-kube-api-access-hsm6f\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.710265 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.710394 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-log-httpd\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.710441 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.729764 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.813260 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.813837 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-log-httpd\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.814382 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-log-httpd\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.814467 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.814552 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-run-httpd\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.814581 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-scripts\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.814660 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-config-data\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.814734 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsm6f\" (UniqueName: \"kubernetes.io/projected/40cdba8a-2602-427c-a95c-65edb257c369-kube-api-access-hsm6f\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.816340 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-run-httpd\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.820848 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-scripts\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.822072 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-config-data\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.823175 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.831202 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:36 crc kubenswrapper[4912]: I0318 13:27:36.862242 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsm6f\" (UniqueName: \"kubernetes.io/projected/40cdba8a-2602-427c-a95c-65edb257c369-kube-api-access-hsm6f\") pod \"ceilometer-0\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " pod="openstack/ceilometer-0" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.000617 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.000711 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.014246 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.093760 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-bb6dbd969-rqk8b"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.097900 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.122183 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.122456 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-cd8zm" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.122600 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.125003 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4flc5\" (UniqueName: \"kubernetes.io/projected/802b4150-7c77-4913-9bd5-94cb3ecf7895-kube-api-access-4flc5\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.125200 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.125288 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-combined-ca-bundle\") pod 
\"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.125396 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data-custom\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.160053 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-bb6dbd969-rqk8b"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.238266 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4flc5\" (UniqueName: \"kubernetes.io/projected/802b4150-7c77-4913-9bd5-94cb3ecf7895-kube-api-access-4flc5\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.238371 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.238427 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-combined-ca-bundle\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.238492 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data-custom\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.252709 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bcs29"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.256780 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.276633 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-combined-ca-bundle\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.277010 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.281093 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data-custom\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.288520 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bcs29"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.307797 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4flc5\" (UniqueName: \"kubernetes.io/projected/802b4150-7c77-4913-9bd5-94cb3ecf7895-kube-api-access-4flc5\") pod \"heat-engine-bb6dbd969-rqk8b\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.356940 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-648f5d994b-9xb5t"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.374123 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.382230 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.434182 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-648f5d994b-9xb5t"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.453927 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.454273 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.454662 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-config\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.454839 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.455346 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtvfz\" (UniqueName: \"kubernetes.io/projected/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-kube-api-access-mtvfz\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.455929 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.459095 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7bf7bf4c9b-55dtz"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.462629 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.472867 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.498863 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7bf7bf4c9b-55dtz"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.508455 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558082 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558157 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558209 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558253 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq986\" (UniqueName: 
\"kubernetes.io/projected/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-kube-api-access-lq986\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558329 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data-custom\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558348 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-config\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558396 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558421 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtvfz\" (UniqueName: \"kubernetes.io/projected/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-kube-api-access-mtvfz\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558450 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-combined-ca-bundle\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.558516 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.561248 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.561939 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.563152 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.564006 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-config\") pod 
\"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.577690 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.579138 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.620152 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtvfz\" (UniqueName: \"kubernetes.io/projected/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-kube-api-access-mtvfz\") pod \"dnsmasq-dns-688b9f5b49-bcs29\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.641827 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.677847 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.677963 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-combined-ca-bundle\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.678227 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.678305 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data-custom\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.678396 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wc68\" (UniqueName: \"kubernetes.io/projected/cbda5023-7a4c-4e65-951e-545c8dc7ec49-kube-api-access-8wc68\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: 
\"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.678579 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq986\" (UniqueName: \"kubernetes.io/projected/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-kube-api-access-lq986\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.678714 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-combined-ca-bundle\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.678868 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data-custom\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.716525 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-combined-ca-bundle\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.717960 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: 
\"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.719299 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data-custom\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.724027 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq986\" (UniqueName: \"kubernetes.io/projected/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-kube-api-access-lq986\") pod \"heat-cfnapi-648f5d994b-9xb5t\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.789483 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-combined-ca-bundle\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.789728 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.789898 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data-custom\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " 
pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.789953 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wc68\" (UniqueName: \"kubernetes.io/projected/cbda5023-7a4c-4e65-951e-545c8dc7ec49-kube-api-access-8wc68\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.818397 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data-custom\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.830274 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.849618 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wc68\" (UniqueName: \"kubernetes.io/projected/cbda5023-7a4c-4e65-951e-545c8dc7ec49-kube-api-access-8wc68\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: I0318 13:27:37.905862 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-combined-ca-bundle\") pod \"heat-api-7bf7bf4c9b-55dtz\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:37 crc kubenswrapper[4912]: 
I0318 13:27:37.913139 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.015838 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.073657 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d86cc5c8f-fqc82" Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.117826 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.177196 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bcdccd79d-bvwsd"] Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.199634 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bcdccd79d-bvwsd" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-api" containerID="cri-o://8bfe9b0ce31606171c776b87ff85fb45661e3ca780a21a8abe4ed0eedfd5ee11" gracePeriod=30 Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.200526 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bcdccd79d-bvwsd" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-httpd" containerID="cri-o://c02851869e590595fd381d5a03d0cba2f3ba9966778d483d5c2d3e985081f5c2" gracePeriod=30 Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.294501 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c" path="/var/lib/kubelet/pods/509bd3c2-7fb9-4c2d-bcc4-4878f5eed23c/volumes" Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.296198 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8423b334-7b23-4086-b08f-22ac5782729c" 
path="/var/lib/kubelet/pods/8423b334-7b23-4086-b08f-22ac5782729c/volumes" Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.378009 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerStarted","Data":"571e5f754fd76954bdaf1f9507cb3d3243e842bb770502bc4833c2934d8c720d"} Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.400026 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc","Type":"ContainerStarted","Data":"fc55e0e74d5c71ac05e0c9de7e581312b8e9ebeacb09d053afbd7898392f1a88"} Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.613330 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-bb6dbd969-rqk8b"] Mar 18 13:27:38 crc kubenswrapper[4912]: W0318 13:27:38.648471 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod802b4150_7c77_4913_9bd5_94cb3ecf7895.slice/crio-554ed5b43999a761b436554ca012322af6a978c53f795d7ddb34fb9b4e6e5cd6 WatchSource:0}: Error finding container 554ed5b43999a761b436554ca012322af6a978c53f795d7ddb34fb9b4e6e5cd6: Status 404 returned error can't find the container with id 554ed5b43999a761b436554ca012322af6a978c53f795d7ddb34fb9b4e6e5cd6 Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.673888 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:27:38 crc kubenswrapper[4912]: I0318 13:27:38.905324 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bcs29"] Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.022268 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.409300 4912 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7bf7bf4c9b-55dtz"] Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.446163 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-648f5d994b-9xb5t"] Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.489522 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-bb6dbd969-rqk8b" event={"ID":"802b4150-7c77-4913-9bd5-94cb3ecf7895","Type":"ContainerStarted","Data":"554ed5b43999a761b436554ca012322af6a978c53f795d7ddb34fb9b4e6e5cd6"} Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.531107 4912 generic.go:334] "Generic (PLEG): container finished" podID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerID="c02851869e590595fd381d5a03d0cba2f3ba9966778d483d5c2d3e985081f5c2" exitCode=0 Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.531227 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcdccd79d-bvwsd" event={"ID":"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967","Type":"ContainerDied","Data":"c02851869e590595fd381d5a03d0cba2f3ba9966778d483d5c2d3e985081f5c2"} Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.553085 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" event={"ID":"6eca8a13-092c-4ab7-8c93-a91e352f2ad0","Type":"ContainerStarted","Data":"1680f82c93e174f4a687deddc1ffd499d1f91f2e80d8f6ed8954cff10563c443"} Mar 18 13:27:39 crc kubenswrapper[4912]: I0318 13:27:39.993314 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvmbk"] Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.610380 4912 generic.go:334] "Generic (PLEG): container finished" podID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerID="b9057db229a35c2f95d9a2edbb01db085519827120f7d1cd65dc62c82fb312b6" exitCode=0 Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.611155 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" event={"ID":"6eca8a13-092c-4ab7-8c93-a91e352f2ad0","Type":"ContainerDied","Data":"b9057db229a35c2f95d9a2edbb01db085519827120f7d1cd65dc62c82fb312b6"} Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.611502 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6f59b977c9-rwwx4"] Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.615900 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.627213 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4p6pn" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:40 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:40 crc kubenswrapper[4912]: > Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.627607 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.627680 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.627878 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.638469 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-bb6dbd969-rqk8b" event={"ID":"802b4150-7c77-4913-9bd5-94cb3ecf7895","Type":"ContainerStarted","Data":"89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52"} Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.639938 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 
13:27:40.640875 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f59b977c9-rwwx4"] Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646270 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-log-httpd\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646345 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-config-data\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646379 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-internal-tls-certs\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646432 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-combined-ca-bundle\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646487 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-public-tls-certs\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646570 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-etc-swift\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646681 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-run-httpd\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.646795 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvhj\" (UniqueName: \"kubernetes.io/projected/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-kube-api-access-nfvhj\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.653757 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" event={"ID":"7f1211a1-9d4a-474f-a2c1-f5f8777e5733","Type":"ContainerStarted","Data":"b895c91c3ee0fc28c3d355983f5242c3af0225a7c8274a815083f6e81bdba1ee"} Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.695264 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc","Type":"ContainerStarted","Data":"7ee1cc1a6d2764eff6597a2ce8080b1c63edb6d412c882b12e17490f7db146c1"} Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.718379 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerStarted","Data":"b60fb01193de9b6c4e664548c173dfcafda38de6a6fe99127d17dff1020e669c"} Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.721609 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-bb6dbd969-rqk8b" podStartSLOduration=3.7215806540000003 podStartE2EDuration="3.721580654s" podCreationTimestamp="2026-03-18 13:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:40.68791072 +0000 UTC m=+1509.147338155" watchObservedRunningTime="2026-03-18 13:27:40.721580654 +0000 UTC m=+1509.181008079" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.730325 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvmbk" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" containerID="cri-o://3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418" gracePeriod=2 Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.731009 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf7bf4c9b-55dtz" event={"ID":"cbda5023-7a4c-4e65-951e-545c8dc7ec49","Type":"ContainerStarted","Data":"79c2c8a89de99574d3470e3d25ca74e83573a4bb52d824ed6e55bedc69479221"} Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.755489 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-run-httpd\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: 
\"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.755713 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvhj\" (UniqueName: \"kubernetes.io/projected/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-kube-api-access-nfvhj\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.755837 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-log-httpd\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.755956 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-config-data\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.756023 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-internal-tls-certs\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.758593 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-combined-ca-bundle\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " 
pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.758700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-run-httpd\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.758748 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-public-tls-certs\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.758931 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-etc-swift\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.776888 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-etc-swift\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.787722 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-log-httpd\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.791293 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-public-tls-certs\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.791682 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-config-data\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.792510 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-combined-ca-bundle\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.793205 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-internal-tls-certs\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.802749 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvhj\" (UniqueName: \"kubernetes.io/projected/08a4effe-9a7e-449c-aba4-74d4b7a4f0ae-kube-api-access-nfvhj\") pod \"swift-proxy-6f59b977c9-rwwx4\" (UID: \"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae\") " pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:40 crc kubenswrapper[4912]: I0318 13:27:40.981240 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:27:41 crc kubenswrapper[4912]: I0318 13:27:41.266917 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.215:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.730564 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.791389 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jhw4z"] Mar 18 13:27:42 crc kubenswrapper[4912]: E0318 13:27:41.792185 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.792206 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" Mar 18 13:27:42 crc kubenswrapper[4912]: E0318 13:27:41.792232 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="extract-content" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.792244 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="extract-content" Mar 18 13:27:42 crc kubenswrapper[4912]: E0318 13:27:41.792311 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="extract-utilities" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.792336 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="extract-utilities" Mar 
18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.792639 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerName="registry-server" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.800800 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.852530 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jhw4z"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.859206 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" event={"ID":"6eca8a13-092c-4ab7-8c93-a91e352f2ad0","Type":"ContainerStarted","Data":"b8aec3cb31cff2d6421d2242c7bd7db74e0a2f4848f3998afd72e1bdd93798cd"} Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.859287 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.900858 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc","Type":"ContainerStarted","Data":"6190911ff133cd1ad8f19b2447a6897b425c631cc546def186ffe352123a88f4"} Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.919794 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerStarted","Data":"437dbc9666d9aa473bc31e046629372456abdf86c3ac6d228b5edbbdc6615445"} Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.928727 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-utilities\") pod \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " Mar 18 13:27:42 crc 
kubenswrapper[4912]: I0318 13:27:41.928799 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-catalog-content\") pod \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.929166 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt66x\" (UniqueName: \"kubernetes.io/projected/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-kube-api-access-vt66x\") pod \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\" (UID: \"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8\") " Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.930120 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0303c0-2058-42de-850b-ea7214f3900c-operator-scripts\") pod \"nova-api-db-create-jhw4z\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.932753 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gn8g\" (UniqueName: \"kubernetes.io/projected/ba0303c0-2058-42de-850b-ea7214f3900c-kube-api-access-6gn8g\") pod \"nova-api-db-create-jhw4z\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.935343 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-utilities" (OuterVolumeSpecName: "utilities") pod "3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" (UID: "3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.942646 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mpxrh"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.960568 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-kube-api-access-vt66x" (OuterVolumeSpecName: "kube-api-access-vt66x") pod "3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" (UID: "3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8"). InnerVolumeSpecName "kube-api-access-vt66x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.962195 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.964633 4912 generic.go:334] "Generic (PLEG): container finished" podID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" containerID="3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418" exitCode=0 Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.966084 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvmbk" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.966340 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmbk" event={"ID":"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8","Type":"ContainerDied","Data":"3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418"} Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.966376 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmbk" event={"ID":"3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8","Type":"ContainerDied","Data":"c68ff814631878a519de1a06ad7f6c8d1d17fd7c979646aab90c964df046a9a7"} Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:41.966403 4912 scope.go:117] "RemoveContainer" containerID="3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.034966 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mpxrh"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.035758 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gn8g\" (UniqueName: \"kubernetes.io/projected/ba0303c0-2058-42de-850b-ea7214f3900c-kube-api-access-6gn8g\") pod \"nova-api-db-create-jhw4z\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.035911 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0303c0-2058-42de-850b-ea7214f3900c-operator-scripts\") pod \"nova-api-db-create-jhw4z\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.045313 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ba0303c0-2058-42de-850b-ea7214f3900c-operator-scripts\") pod \"nova-api-db-create-jhw4z\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.045376 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.045621 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt66x\" (UniqueName: \"kubernetes.io/projected/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-kube-api-access-vt66x\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.064518 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e151-account-create-update-9552s"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.080247 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.081633 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" (UID: "3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.095457 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.095746 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gn8g\" (UniqueName: \"kubernetes.io/projected/ba0303c0-2058-42de-850b-ea7214f3900c-kube-api-access-6gn8g\") pod \"nova-api-db-create-jhw4z\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.113901 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e151-account-create-update-9552s"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.155899 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1108851-b127-4ccb-8c81-bfbe9de7267e-operator-scripts\") pod \"nova-cell0-db-create-mpxrh\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.157807 4912 scope.go:117] "RemoveContainer" containerID="e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.156027 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4fk\" (UniqueName: \"kubernetes.io/projected/e1108851-b127-4ccb-8c81-bfbe9de7267e-kube-api-access-qd4fk\") pod \"nova-cell0-db-create-mpxrh\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.162239 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.174519 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" podStartSLOduration=5.17448673 podStartE2EDuration="5.17448673s" podCreationTimestamp="2026-03-18 13:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:41.901023071 +0000 UTC m=+1510.360450496" watchObservedRunningTime="2026-03-18 13:27:42.17448673 +0000 UTC m=+1510.633914155" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.186475 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.223131 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:42 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:27:42 crc kubenswrapper[4912]: > Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.232968 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.232941129 podStartE2EDuration="6.232941129s" podCreationTimestamp="2026-03-18 13:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:41.946885702 +0000 UTC m=+1510.406313167" watchObservedRunningTime="2026-03-18 13:27:42.232941129 +0000 UTC m=+1510.692368564" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.265431 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1108851-b127-4ccb-8c81-bfbe9de7267e-operator-scripts\") pod \"nova-cell0-db-create-mpxrh\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.265587 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6235cc-b662-414c-97c5-0f5b3550d605-operator-scripts\") pod \"nova-api-e151-account-create-update-9552s\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.265618 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb82d\" (UniqueName: \"kubernetes.io/projected/2d6235cc-b662-414c-97c5-0f5b3550d605-kube-api-access-tb82d\") pod \"nova-api-e151-account-create-update-9552s\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.275843 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4fk\" (UniqueName: \"kubernetes.io/projected/e1108851-b127-4ccb-8c81-bfbe9de7267e-kube-api-access-qd4fk\") pod \"nova-cell0-db-create-mpxrh\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.292911 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1108851-b127-4ccb-8c81-bfbe9de7267e-operator-scripts\") pod \"nova-cell0-db-create-mpxrh\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.406416 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6235cc-b662-414c-97c5-0f5b3550d605-operator-scripts\") pod \"nova-api-e151-account-create-update-9552s\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.406484 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb82d\" (UniqueName: \"kubernetes.io/projected/2d6235cc-b662-414c-97c5-0f5b3550d605-kube-api-access-tb82d\") pod \"nova-api-e151-account-create-update-9552s\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.443409 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4fk\" (UniqueName: \"kubernetes.io/projected/e1108851-b127-4ccb-8c81-bfbe9de7267e-kube-api-access-qd4fk\") pod \"nova-cell0-db-create-mpxrh\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.463935 4912 scope.go:117] "RemoveContainer" containerID="57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.477505 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb82d\" (UniqueName: \"kubernetes.io/projected/2d6235cc-b662-414c-97c5-0f5b3550d605-kube-api-access-tb82d\") pod \"nova-api-e151-account-create-update-9552s\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.494217 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2d6235cc-b662-414c-97c5-0f5b3550d605-operator-scripts\") pod \"nova-api-e151-account-create-update-9552s\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.576806 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8prkm"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.578698 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8prkm"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.578722 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e03c-account-create-update-vtvfd"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.580278 4912 scope.go:117] "RemoveContainer" containerID="3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.581754 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: E0318 13:27:42.583433 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418\": container with ID starting with 3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418 not found: ID does not exist" containerID="3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.583503 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418"} err="failed to get container status \"3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418\": rpc error: code = NotFound desc = could not find container \"3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418\": container with ID starting with 3c0978d67a0d5717766b2e526cf369a8b64bd02934fe5f56c098d90b1e57b418 not found: ID does not exist" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.583538 4912 scope.go:117] "RemoveContainer" containerID="e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035" Mar 18 13:27:42 crc kubenswrapper[4912]: E0318 13:27:42.584753 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035\": container with ID starting with e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035 not found: ID does not exist" containerID="e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.584801 4912 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035"} err="failed to get container status \"e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035\": rpc error: code = NotFound desc = could not find container \"e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035\": container with ID starting with e2fc19c2aee260e53bdabc44647b9cfd72089cb64ca2c616fa1c4d471bb60035 not found: ID does not exist" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.584835 4912 scope.go:117] "RemoveContainer" containerID="57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e" Mar 18 13:27:42 crc kubenswrapper[4912]: E0318 13:27:42.587159 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e\": container with ID starting with 57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e not found: ID does not exist" containerID="57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.587189 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e"} err="failed to get container status \"57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e\": rpc error: code = NotFound desc = could not find container \"57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e\": container with ID starting with 57b0fa5018bd990f8a6ccc9c82932ea25d453077a813a22b9d5f8b98e3d0e88e not found: ID does not exist" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.605859 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.609593 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.628251 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e03c-account-create-update-vtvfd"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.638203 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.654717 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.679399 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4ce1-account-create-update-b9dnq"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.681620 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.690438 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.704914 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ce1-account-create-update-b9dnq"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.737215 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72532937-9ae3-416d-968f-6a7031ec3055-operator-scripts\") pod \"nova-cell1-db-create-8prkm\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.749447 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmxh\" (UniqueName: \"kubernetes.io/projected/72532937-9ae3-416d-968f-6a7031ec3055-kube-api-access-4xmxh\") pod \"nova-cell1-db-create-8prkm\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.749975 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-operator-scripts\") pod \"nova-cell0-e03c-account-create-update-vtvfd\" (UID: \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.750224 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lkdg\" (UniqueName: \"kubernetes.io/projected/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-kube-api-access-7lkdg\") pod 
\"nova-cell0-e03c-account-create-update-vtvfd\" (UID: \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.852526 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lkdg\" (UniqueName: \"kubernetes.io/projected/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-kube-api-access-7lkdg\") pod \"nova-cell0-e03c-account-create-update-vtvfd\" (UID: \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.853070 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72532937-9ae3-416d-968f-6a7031ec3055-operator-scripts\") pod \"nova-cell1-db-create-8prkm\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.853139 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xmxh\" (UniqueName: \"kubernetes.io/projected/72532937-9ae3-416d-968f-6a7031ec3055-kube-api-access-4xmxh\") pod \"nova-cell1-db-create-8prkm\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.853250 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84nn\" (UniqueName: \"kubernetes.io/projected/7499c8bd-8342-41ca-933a-d0975f9d18e5-kube-api-access-x84nn\") pod \"nova-cell1-4ce1-account-create-update-b9dnq\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.853277 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7499c8bd-8342-41ca-933a-d0975f9d18e5-operator-scripts\") pod \"nova-cell1-4ce1-account-create-update-b9dnq\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.853309 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-operator-scripts\") pod \"nova-cell0-e03c-account-create-update-vtvfd\" (UID: \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.856652 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72532937-9ae3-416d-968f-6a7031ec3055-operator-scripts\") pod \"nova-cell1-db-create-8prkm\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.861847 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-operator-scripts\") pod \"nova-cell0-e03c-account-create-update-vtvfd\" (UID: \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.868269 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvmbk"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.887828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lkdg\" (UniqueName: \"kubernetes.io/projected/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-kube-api-access-7lkdg\") pod \"nova-cell0-e03c-account-create-update-vtvfd\" (UID: 
\"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.887934 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvmbk"] Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.925663 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xmxh\" (UniqueName: \"kubernetes.io/projected/72532937-9ae3-416d-968f-6a7031ec3055-kube-api-access-4xmxh\") pod \"nova-cell1-db-create-8prkm\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.938584 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.956133 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x84nn\" (UniqueName: \"kubernetes.io/projected/7499c8bd-8342-41ca-933a-d0975f9d18e5-kube-api-access-x84nn\") pod \"nova-cell1-4ce1-account-create-update-b9dnq\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.956184 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7499c8bd-8342-41ca-933a-d0975f9d18e5-operator-scripts\") pod \"nova-cell1-4ce1-account-create-update-b9dnq\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.960121 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7499c8bd-8342-41ca-933a-d0975f9d18e5-operator-scripts\") pod 
\"nova-cell1-4ce1-account-create-update-b9dnq\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.980687 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:27:42 crc kubenswrapper[4912]: I0318 13:27:42.986535 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84nn\" (UniqueName: \"kubernetes.io/projected/7499c8bd-8342-41ca-933a-d0975f9d18e5-kube-api-access-x84nn\") pod \"nova-cell1-4ce1-account-create-update-b9dnq\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:43 crc kubenswrapper[4912]: I0318 13:27:43.040912 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:27:43 crc kubenswrapper[4912]: I0318 13:27:43.165913 4912 generic.go:334] "Generic (PLEG): container finished" podID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerID="8bfe9b0ce31606171c776b87ff85fb45661e3ca780a21a8abe4ed0eedfd5ee11" exitCode=0 Mar 18 13:27:43 crc kubenswrapper[4912]: I0318 13:27:43.168657 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcdccd79d-bvwsd" event={"ID":"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967","Type":"ContainerDied","Data":"8bfe9b0ce31606171c776b87ff85fb45661e3ca780a21a8abe4ed0eedfd5ee11"} Mar 18 13:27:43 crc kubenswrapper[4912]: I0318 13:27:43.846423 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f59b977c9-rwwx4"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.416560 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8" path="/var/lib/kubelet/pods/3de9cd0d-9467-4cb6-9bf2-3c283afaa2a8/volumes" Mar 18 13:27:44 crc kubenswrapper[4912]: 
I0318 13:27:44.418552 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerStarted","Data":"79cf2123c3fb6756becda720c0fad650ca043f1d0fb4eedde3f6548ccc9dc910"} Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.485677 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jhw4z"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.516146 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mpxrh"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.542344 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e151-account-create-update-9552s"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.561987 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e03c-account-create-update-vtvfd"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.638954 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-684f5dccdc-5k4b9"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.641098 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.689778 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6f768dfb8d-bq88g"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.691994 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.706479 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6f768dfb8d-bq88g"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.730382 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-684f5dccdc-5k4b9"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.749698 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7f6459544-dzg22"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.751649 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.798707 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.799901 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data-custom\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.799962 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 
13:27:44.799995 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-combined-ca-bundle\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.800025 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-combined-ca-bundle\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.800191 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg7gt\" (UniqueName: \"kubernetes.io/projected/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-kube-api-access-fg7gt\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.800287 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data-custom\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.800410 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qssx8\" (UniqueName: \"kubernetes.io/projected/63512997-1801-4665-9f60-91d912dc57e8-kube-api-access-qssx8\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " 
pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.823404 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f6459544-dzg22"] Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.905214 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbplg\" (UniqueName: \"kubernetes.io/projected/6de3a595-b691-490f-961d-e0471af1f517-kube-api-access-cbplg\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.905505 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.905607 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.905768 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-combined-ca-bundle\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906108 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data-custom\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906156 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906208 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-combined-ca-bundle\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906247 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-combined-ca-bundle\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906276 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data-custom\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906443 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg7gt\" (UniqueName: 
\"kubernetes.io/projected/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-kube-api-access-fg7gt\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906464 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data-custom\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.906573 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qssx8\" (UniqueName: \"kubernetes.io/projected/63512997-1801-4665-9f60-91d912dc57e8-kube-api-access-qssx8\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.917089 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data-custom\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.918455 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.928850 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-combined-ca-bundle\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.931063 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.931659 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qssx8\" (UniqueName: \"kubernetes.io/projected/63512997-1801-4665-9f60-91d912dc57e8-kube-api-access-qssx8\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.942614 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-combined-ca-bundle\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.968461 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data-custom\") pod \"heat-engine-684f5dccdc-5k4b9\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:44 crc kubenswrapper[4912]: I0318 13:27:44.989183 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg7gt\" (UniqueName: 
\"kubernetes.io/projected/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-kube-api-access-fg7gt\") pod \"heat-api-6f768dfb8d-bq88g\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") " pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.010153 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbplg\" (UniqueName: \"kubernetes.io/projected/6de3a595-b691-490f-961d-e0471af1f517-kube-api-access-cbplg\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.010274 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.010340 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-combined-ca-bundle\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.010479 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data-custom\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.022209 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.027221 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data-custom\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.033843 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.043393 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-combined-ca-bundle\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.044051 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbplg\" (UniqueName: \"kubernetes.io/projected/6de3a595-b691-490f-961d-e0471af1f517-kube-api-access-cbplg\") pod \"heat-cfnapi-7f6459544-dzg22\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") " pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.048606 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ce1-account-create-update-b9dnq"] Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.050532 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.061330 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8prkm"] Mar 18 13:27:45 crc kubenswrapper[4912]: I0318 13:27:45.098669 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:27:46 crc kubenswrapper[4912]: W0318 13:27:46.106584 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72532937_9ae3_416d_968f_6a7031ec3055.slice/crio-3d9649e4ac4ad4ac5728d64b97691d542ac013330a494f9241ac7c124268d75f WatchSource:0}: Error finding container 3d9649e4ac4ad4ac5728d64b97691d542ac013330a494f9241ac7c124268d75f: Status 404 returned error can't find the container with id 3d9649e4ac4ad4ac5728d64b97691d542ac013330a494f9241ac7c124268d75f Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.203368 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.262109 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.310964 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.215:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.344502 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcdccd79d-bvwsd" event={"ID":"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967","Type":"ContainerDied","Data":"4c06223e1c3f08f0c74c841cf77654c2c44a1308255b110c4c6bdcf3173acce2"} Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.344950 4912 scope.go:117] "RemoveContainer" containerID="c02851869e590595fd381d5a03d0cba2f3ba9966778d483d5c2d3e985081f5c2" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.345287 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bcdccd79d-bvwsd" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.354772 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8prkm" event={"ID":"72532937-9ae3-416d-968f-6a7031ec3055","Type":"ContainerStarted","Data":"3d9649e4ac4ad4ac5728d64b97691d542ac013330a494f9241ac7c124268d75f"} Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.359419 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-config\") pod \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.359488 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-httpd-config\") pod \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.359562 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-combined-ca-bundle\") pod \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.359712 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-ovndb-tls-certs\") pod \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.359875 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mpxrh" 
event={"ID":"e1108851-b127-4ccb-8c81-bfbe9de7267e","Type":"ContainerStarted","Data":"944eea457a4d562321b02809b170da469f291eeb3e003fc50a477274463f725d"} Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.359941 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jml2\" (UniqueName: \"kubernetes.io/projected/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-kube-api-access-6jml2\") pod \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\" (UID: \"e3df1cac-4f07-4be6-9c80-dcdd7b6c1967\") " Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.384448 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-kube-api-access-6jml2" (OuterVolumeSpecName: "kube-api-access-6jml2") pod "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" (UID: "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967"). InnerVolumeSpecName "kube-api-access-6jml2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.403497 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" (UID: "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.485323 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jml2\" (UniqueName: \"kubernetes.io/projected/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-kube-api-access-6jml2\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.485373 4912 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.491830 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-config" (OuterVolumeSpecName: "config") pod "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" (UID: "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.505278 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" (UID: "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.536363 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" (UID: "e3df1cac-4f07-4be6-9c80-dcdd7b6c1967"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.592633 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.592880 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.592941 4912 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.730539 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.770984 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bcdccd79d-bvwsd"] Mar 18 13:27:46 crc kubenswrapper[4912]: I0318 13:27:46.790485 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bcdccd79d-bvwsd"] Mar 18 13:27:47 crc kubenswrapper[4912]: W0318 13:27:47.141449 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d6235cc_b662_414c_97c5_0f5b3550d605.slice/crio-33238352fd96f262db3a4689a16aa90d56117ed9f86b6ad71d79ba2fdab26a0c WatchSource:0}: Error finding container 33238352fd96f262db3a4689a16aa90d56117ed9f86b6ad71d79ba2fdab26a0c: Status 404 returned error can't find the container with id 33238352fd96f262db3a4689a16aa90d56117ed9f86b6ad71d79ba2fdab26a0c Mar 18 13:27:47 crc kubenswrapper[4912]: W0318 13:27:47.146235 4912 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a4effe_9a7e_449c_aba4_74d4b7a4f0ae.slice/crio-5875c3c093567ea95d62e2863ec3c8009ba79366dac55c95a27e588b75cbe5de WatchSource:0}: Error finding container 5875c3c093567ea95d62e2863ec3c8009ba79366dac55c95a27e588b75cbe5de: Status 404 returned error can't find the container with id 5875c3c093567ea95d62e2863ec3c8009ba79366dac55c95a27e588b75cbe5de Mar 18 13:27:47 crc kubenswrapper[4912]: W0318 13:27:47.160730 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc0d3692_3e21_4b00_9629_f5a4d2140ca9.slice/crio-e5bdd26264824cb4439396912f802090475086eaa340e9f1b9922d639e18a55a WatchSource:0}: Error finding container e5bdd26264824cb4439396912f802090475086eaa340e9f1b9922d639e18a55a: Status 404 returned error can't find the container with id e5bdd26264824cb4439396912f802090475086eaa340e9f1b9922d639e18a55a Mar 18 13:27:47 crc kubenswrapper[4912]: W0318 13:27:47.162435 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7499c8bd_8342_41ca_933a_d0975f9d18e5.slice/crio-19df0ea236bf44b3c4dffff920c3c994ab2f9a6f95c048cab0474e947385ef25 WatchSource:0}: Error finding container 19df0ea236bf44b3c4dffff920c3c994ab2f9a6f95c048cab0474e947385ef25: Status 404 returned error can't find the container with id 19df0ea236bf44b3c4dffff920c3c994ab2f9a6f95c048cab0474e947385ef25 Mar 18 13:27:47 crc kubenswrapper[4912]: W0318 13:27:47.166878 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba0303c0_2058_42de_850b_ea7214f3900c.slice/crio-9a78053251e141dbd921e44eb2975a8384d3aca5a29515dad1713f1b1b13ff94 WatchSource:0}: Error finding container 9a78053251e141dbd921e44eb2975a8384d3aca5a29515dad1713f1b1b13ff94: Status 404 returned error can't find the container with id 
9a78053251e141dbd921e44eb2975a8384d3aca5a29515dad1713f1b1b13ff94 Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.222092 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.416212 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e151-account-create-update-9552s" event={"ID":"2d6235cc-b662-414c-97c5-0f5b3550d605","Type":"ContainerStarted","Data":"33238352fd96f262db3a4689a16aa90d56117ed9f86b6ad71d79ba2fdab26a0c"} Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.426718 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jhw4z" event={"ID":"ba0303c0-2058-42de-850b-ea7214f3900c","Type":"ContainerStarted","Data":"9a78053251e141dbd921e44eb2975a8384d3aca5a29515dad1713f1b1b13ff94"} Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.430378 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" event={"ID":"7499c8bd-8342-41ca-933a-d0975f9d18e5","Type":"ContainerStarted","Data":"19df0ea236bf44b3c4dffff920c3c994ab2f9a6f95c048cab0474e947385ef25"} Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.435285 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" event={"ID":"cc0d3692-3e21-4b00-9629-f5a4d2140ca9","Type":"ContainerStarted","Data":"e5bdd26264824cb4439396912f802090475086eaa340e9f1b9922d639e18a55a"} Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.439203 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f59b977c9-rwwx4" event={"ID":"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae","Type":"ContainerStarted","Data":"5875c3c093567ea95d62e2863ec3c8009ba79366dac55c95a27e588b75cbe5de"} Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.645953 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.766938 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7z4mf"] Mar 18 13:27:47 crc kubenswrapper[4912]: I0318 13:27:47.767831 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="dnsmasq-dns" containerID="cri-o://a162166d3edc118ef7452a18bc245f968a9ef84696a13788ba367fa2487f1f5c" gracePeriod=10 Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 13:27:48.267965 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" path="/var/lib/kubelet/pods/e3df1cac-4f07-4be6-9c80-dcdd7b6c1967/volumes" Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 13:27:48.457358 4912 generic.go:334] "Generic (PLEG): container finished" podID="1331546f-e949-4d01-97fa-48a28a165bec" containerID="a162166d3edc118ef7452a18bc245f968a9ef84696a13788ba367fa2487f1f5c" exitCode=0 Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 13:27:48.457480 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" event={"ID":"1331546f-e949-4d01-97fa-48a28a165bec","Type":"ContainerDied","Data":"a162166d3edc118ef7452a18bc245f968a9ef84696a13788ba367fa2487f1f5c"} Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 13:27:48.461858 4912 generic.go:334] "Generic (PLEG): container finished" podID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerID="7b5dfd0437ff56f3ad7e0ec7e039c3c6789dfbbc41231f5883c5919f177d2635" exitCode=137 Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 13:27:48.461934 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"299eab28-d9b3-4b6d-88d7-358f2c15fd2d","Type":"ContainerDied","Data":"7b5dfd0437ff56f3ad7e0ec7e039c3c6789dfbbc41231f5883c5919f177d2635"} Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 
13:27:48.946442 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-648f5d994b-9xb5t"] Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 13:27:48.994981 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76bbd4596f-wxfhz"] Mar 18 13:27:48 crc kubenswrapper[4912]: E0318 13:27:48.995756 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-api" Mar 18 13:27:48 crc kubenswrapper[4912]: I0318 13:27:48.995784 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-api" Mar 18 13:27:49 crc kubenswrapper[4912]: E0318 13:27:48.995827 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-httpd" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.004066 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-httpd" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.004884 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-api" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.004916 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3df1cac-4f07-4be6-9c80-dcdd7b6c1967" containerName="neutron-httpd" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.006572 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.010665 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.013979 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.024480 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7bf7bf4c9b-55dtz"] Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.088301 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76bbd4596f-wxfhz"] Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.161153 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7b4b74494b-wgmr2"] Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.165458 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.171652 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.172903 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.191419 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lc7t\" (UniqueName: \"kubernetes.io/projected/02cb4335-ba8d-434d-b6fe-047b87453890-kube-api-access-2lc7t\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.191518 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-internal-tls-certs\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.191843 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-combined-ca-bundle\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.192112 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: 
\"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.192242 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data-custom\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.192613 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-public-tls-certs\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.210909 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7b4b74494b-wgmr2"] Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.297871 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-combined-ca-bundle\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.297978 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lc7t\" (UniqueName: \"kubernetes.io/projected/02cb4335-ba8d-434d-b6fe-047b87453890-kube-api-access-2lc7t\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298019 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-internal-tls-certs\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298085 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-combined-ca-bundle\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298131 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298195 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298231 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-internal-tls-certs\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298282 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data-custom\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298326 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lws6r\" (UniqueName: \"kubernetes.io/projected/47d7c152-dc8f-4406-acbf-e5af6902d651-kube-api-access-lws6r\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298418 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-public-tls-certs\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298455 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data-custom\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.298636 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-public-tls-certs\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.311322 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-public-tls-certs\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.320921 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-internal-tls-certs\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.322088 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.322644 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data-custom\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.322777 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-combined-ca-bundle\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.329688 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lc7t\" (UniqueName: 
\"kubernetes.io/projected/02cb4335-ba8d-434d-b6fe-047b87453890-kube-api-access-2lc7t\") pod \"heat-cfnapi-76bbd4596f-wxfhz\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.357575 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.417467 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-public-tls-certs\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.417686 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-combined-ca-bundle\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.417866 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.418012 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-internal-tls-certs\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 
13:27:49.418188 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lws6r\" (UniqueName: \"kubernetes.io/projected/47d7c152-dc8f-4406-acbf-e5af6902d651-kube-api-access-lws6r\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.418346 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data-custom\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.432496 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.436546 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data-custom\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.438084 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-combined-ca-bundle\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.444576 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-internal-tls-certs\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.447164 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-public-tls-certs\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.447089 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lws6r\" (UniqueName: \"kubernetes.io/projected/47d7c152-dc8f-4406-acbf-e5af6902d651-kube-api-access-lws6r\") pod \"heat-api-7b4b74494b-wgmr2\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.550443 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.572796 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.635647 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:49 crc kubenswrapper[4912]: I0318 13:27:49.851977 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4p6pn"] Mar 18 13:27:50 crc kubenswrapper[4912]: I0318 13:27:50.870110 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.214:5353: connect: connection refused" Mar 18 13:27:51 crc kubenswrapper[4912]: I0318 13:27:51.223812 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.215:8776/healthcheck\": dial tcp 10.217.0.215:8776: connect: connection refused" Mar 18 13:27:51 crc kubenswrapper[4912]: I0318 13:27:51.518632 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4p6pn" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="registry-server" containerID="cri-o://454c7be19d73679c256628d13e874e24bb5eeb28e155e4562045b0c2c18d55b5" gracePeriod=2 Mar 18 13:27:51 crc kubenswrapper[4912]: I0318 13:27:51.936468 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:27:51 crc kubenswrapper[4912]: 
timeout: failed to connect service ":50051" within 1s Mar 18 13:27:51 crc kubenswrapper[4912]: > Mar 18 13:27:52 crc kubenswrapper[4912]: I0318 13:27:52.552588 4912 generic.go:334] "Generic (PLEG): container finished" podID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerID="454c7be19d73679c256628d13e874e24bb5eeb28e155e4562045b0c2c18d55b5" exitCode=0 Mar 18 13:27:52 crc kubenswrapper[4912]: I0318 13:27:52.553350 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p6pn" event={"ID":"e6d9cf46-afab-44ff-b703-5c55afdcc2d2","Type":"ContainerDied","Data":"454c7be19d73679c256628d13e874e24bb5eeb28e155e4562045b0c2c18d55b5"} Mar 18 13:27:55 crc kubenswrapper[4912]: I0318 13:27:55.143550 4912 scope.go:117] "RemoveContainer" containerID="8bfe9b0ce31606171c776b87ff85fb45661e3ca780a21a8abe4ed0eedfd5ee11" Mar 18 13:27:55 crc kubenswrapper[4912]: I0318 13:27:55.873422 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.214:5353: connect: connection refused" Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.126572 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6f768dfb8d-bq88g"] Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.787547 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.796632 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-684f5dccdc-5k4b9"] Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.802848 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"299eab28-d9b3-4b6d-88d7-358f2c15fd2d","Type":"ContainerDied","Data":"019db5d1732d6f75d86c430ec9d4f2e82043f07ea3c7af7a752f0ae61df9db2c"} Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.802907 4912 scope.go:117] "RemoveContainer" containerID="7b5dfd0437ff56f3ad7e0ec7e039c3c6789dfbbc41231f5883c5919f177d2635" Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.833693 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6f768dfb8d-bq88g" event={"ID":"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2","Type":"ContainerStarted","Data":"3eaf5fbf02cb187329278a4b02ded18215172bac36be890564d931117cd5234a"} Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.906163 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.929087 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rt6j\" (UniqueName: \"kubernetes.io/projected/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-kube-api-access-9rt6j\") pod \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.929207 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-scripts\") pod \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.929245 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-combined-ca-bundle\") pod \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.929397 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-etc-machine-id\") pod \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.929430 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data\") pod \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.929555 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-logs\") pod \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.929704 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data-custom\") pod \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\" (UID: \"299eab28-d9b3-4b6d-88d7-358f2c15fd2d\") " Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.932741 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "299eab28-d9b3-4b6d-88d7-358f2c15fd2d" (UID: "299eab28-d9b3-4b6d-88d7-358f2c15fd2d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.937011 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-logs" (OuterVolumeSpecName: "logs") pod "299eab28-d9b3-4b6d-88d7-358f2c15fd2d" (UID: "299eab28-d9b3-4b6d-88d7-358f2c15fd2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:56 crc kubenswrapper[4912]: I0318 13:27:56.961614 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.034389 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzhqr\" (UniqueName: \"kubernetes.io/projected/1331546f-e949-4d01-97fa-48a28a165bec-kube-api-access-fzhqr\") pod \"1331546f-e949-4d01-97fa-48a28a165bec\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.034552 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-config\") pod \"1331546f-e949-4d01-97fa-48a28a165bec\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.034624 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-utilities\") pod \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.034657 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-swift-storage-0\") pod \"1331546f-e949-4d01-97fa-48a28a165bec\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.034708 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rq46\" (UniqueName: \"kubernetes.io/projected/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-kube-api-access-6rq46\") pod \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.034849 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-sb\") pod \"1331546f-e949-4d01-97fa-48a28a165bec\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.035056 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-svc\") pod \"1331546f-e949-4d01-97fa-48a28a165bec\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.035098 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-nb\") pod \"1331546f-e949-4d01-97fa-48a28a165bec\" (UID: \"1331546f-e949-4d01-97fa-48a28a165bec\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.035162 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-catalog-content\") pod \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\" (UID: \"e6d9cf46-afab-44ff-b703-5c55afdcc2d2\") " Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.035986 4912 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.036007 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.045815 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-utilities" 
(OuterVolumeSpecName: "utilities") pod "e6d9cf46-afab-44ff-b703-5c55afdcc2d2" (UID: "e6d9cf46-afab-44ff-b703-5c55afdcc2d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.075770 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-scripts" (OuterVolumeSpecName: "scripts") pod "299eab28-d9b3-4b6d-88d7-358f2c15fd2d" (UID: "299eab28-d9b3-4b6d-88d7-358f2c15fd2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.077494 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-kube-api-access-9rt6j" (OuterVolumeSpecName: "kube-api-access-9rt6j") pod "299eab28-d9b3-4b6d-88d7-358f2c15fd2d" (UID: "299eab28-d9b3-4b6d-88d7-358f2c15fd2d"). InnerVolumeSpecName "kube-api-access-9rt6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.080935 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "299eab28-d9b3-4b6d-88d7-358f2c15fd2d" (UID: "299eab28-d9b3-4b6d-88d7-358f2c15fd2d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.085070 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f6459544-dzg22"] Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.097201 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-kube-api-access-6rq46" (OuterVolumeSpecName: "kube-api-access-6rq46") pod "e6d9cf46-afab-44ff-b703-5c55afdcc2d2" (UID: "e6d9cf46-afab-44ff-b703-5c55afdcc2d2"). InnerVolumeSpecName "kube-api-access-6rq46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.097352 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1331546f-e949-4d01-97fa-48a28a165bec-kube-api-access-fzhqr" (OuterVolumeSpecName: "kube-api-access-fzhqr") pod "1331546f-e949-4d01-97fa-48a28a165bec" (UID: "1331546f-e949-4d01-97fa-48a28a165bec"). InnerVolumeSpecName "kube-api-access-fzhqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.132444 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6d9cf46-afab-44ff-b703-5c55afdcc2d2" (UID: "e6d9cf46-afab-44ff-b703-5c55afdcc2d2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.154544 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.154579 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rq46\" (UniqueName: \"kubernetes.io/projected/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-kube-api-access-6rq46\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.154592 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.154620 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d9cf46-afab-44ff-b703-5c55afdcc2d2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.154631 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rt6j\" (UniqueName: \"kubernetes.io/projected/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-kube-api-access-9rt6j\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.154640 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.154650 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzhqr\" (UniqueName: \"kubernetes.io/projected/1331546f-e949-4d01-97fa-48a28a165bec-kube-api-access-fzhqr\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:57 crc kubenswrapper[4912]: 
W0318 13:27:57.178348 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de3a595_b691_490f_961d_e0471af1f517.slice/crio-05e837a51c9e30ec3887f672ec245a4e74bcf9d0b13350dc5bf54206b449db04 WatchSource:0}: Error finding container 05e837a51c9e30ec3887f672ec245a4e74bcf9d0b13350dc5bf54206b449db04: Status 404 returned error can't find the container with id 05e837a51c9e30ec3887f672ec245a4e74bcf9d0b13350dc5bf54206b449db04 Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.218266 4912 scope.go:117] "RemoveContainer" containerID="614faa5191b340f6797d0038f10cb9d79eae2eb421c3f233a8a40dabbb8273d9" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.277514 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7b4b74494b-wgmr2"] Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.482437 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76bbd4596f-wxfhz"] Mar 18 13:27:57 crc kubenswrapper[4912]: W0318 13:27:57.494132 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02cb4335_ba8d_434d_b6fe_047b87453890.slice/crio-a10b010ed7d1d7a5ca046e5f19594738611f55cca9ddd18a34cdba06720b023d WatchSource:0}: Error finding container a10b010ed7d1d7a5ca046e5f19594738611f55cca9ddd18a34cdba06720b023d: Status 404 returned error can't find the container with id a10b010ed7d1d7a5ca046e5f19594738611f55cca9ddd18a34cdba06720b023d Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.863108 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8prkm" event={"ID":"72532937-9ae3-416d-968f-6a7031ec3055","Type":"ContainerStarted","Data":"e3aade8978425c41d2d64020c6223165d8a91513fcc8254ddf345c604fc65a17"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.872942 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.878173 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-684f5dccdc-5k4b9" event={"ID":"63512997-1801-4665-9f60-91d912dc57e8","Type":"ContainerStarted","Data":"6d85c67e67fe1e8c98a908e4516ac993d252357f55fb8cb607597eab37d148ef"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.894921 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-8prkm" podStartSLOduration=16.894892531 podStartE2EDuration="16.894892531s" podCreationTimestamp="2026-03-18 13:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:57.888810077 +0000 UTC m=+1526.348237502" watchObservedRunningTime="2026-03-18 13:27:57.894892531 +0000 UTC m=+1526.354319976" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.902811 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p6pn" event={"ID":"e6d9cf46-afab-44ff-b703-5c55afdcc2d2","Type":"ContainerDied","Data":"a0d2cdcb2f796d7996edb4ff676b64f8f42a029542fe24cea41b68690c3fbfcb"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.902965 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4p6pn" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.915514 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f6459544-dzg22" event={"ID":"6de3a595-b691-490f-961d-e0471af1f517","Type":"ContainerStarted","Data":"05e837a51c9e30ec3887f672ec245a4e74bcf9d0b13350dc5bf54206b449db04"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.922667 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" event={"ID":"02cb4335-ba8d-434d-b6fe-047b87453890","Type":"ContainerStarted","Data":"a10b010ed7d1d7a5ca046e5f19594738611f55cca9ddd18a34cdba06720b023d"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.926607 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" event={"ID":"1331546f-e949-4d01-97fa-48a28a165bec","Type":"ContainerDied","Data":"544dd678336f5493c183f70672d37606d21ed7fcb34a56bc46d75f3382e56fcb"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.926756 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7z4mf" Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.930212 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" event={"ID":"7499c8bd-8342-41ca-933a-d0975f9d18e5","Type":"ContainerStarted","Data":"a5f37e41a59cddebc7ea3a8d97862a13f1897831c952b7dc81090e18672b682a"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.937980 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" event={"ID":"cc0d3692-3e21-4b00-9629-f5a4d2140ca9","Type":"ContainerStarted","Data":"edd91f2375d121cb13ef2d7ab1ae9b08383f08124083958323e19244892bae7f"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.950265 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f59b977c9-rwwx4" event={"ID":"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae","Type":"ContainerStarted","Data":"27a45b9f5f95b687d4c0ec5cd44664a550540949c0fe3be491578c7d51fd726e"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.952857 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4b74494b-wgmr2" event={"ID":"47d7c152-dc8f-4406-acbf-e5af6902d651","Type":"ContainerStarted","Data":"23e422b622c8513636bc38048544521e2c7e9b6d6b3a0c8f975c22dd57ca09cd"} Mar 18 13:27:57 crc kubenswrapper[4912]: I0318 13:27:57.968205 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" podStartSLOduration=15.968170848 podStartE2EDuration="15.968170848s" podCreationTimestamp="2026-03-18 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:57.958540099 +0000 UTC m=+1526.417967524" watchObservedRunningTime="2026-03-18 13:27:57.968170848 +0000 UTC m=+1526.427598283" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 
13:27:58.007356 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" podStartSLOduration=16.007323928 podStartE2EDuration="16.007323928s" podCreationTimestamp="2026-03-18 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:27:57.982333098 +0000 UTC m=+1526.441760543" watchObservedRunningTime="2026-03-18 13:27:58.007323928 +0000 UTC m=+1526.466751353" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.420220 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "299eab28-d9b3-4b6d-88d7-358f2c15fd2d" (UID: "299eab28-d9b3-4b6d-88d7-358f2c15fd2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.442149 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1331546f-e949-4d01-97fa-48a28a165bec" (UID: "1331546f-e949-4d01-97fa-48a28a165bec"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.442844 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.442878 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.454966 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data" (OuterVolumeSpecName: "config-data") pod "299eab28-d9b3-4b6d-88d7-358f2c15fd2d" (UID: "299eab28-d9b3-4b6d-88d7-358f2c15fd2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.496087 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1331546f-e949-4d01-97fa-48a28a165bec" (UID: "1331546f-e949-4d01-97fa-48a28a165bec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.498993 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-config" (OuterVolumeSpecName: "config") pod "1331546f-e949-4d01-97fa-48a28a165bec" (UID: "1331546f-e949-4d01-97fa-48a28a165bec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.513919 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1331546f-e949-4d01-97fa-48a28a165bec" (UID: "1331546f-e949-4d01-97fa-48a28a165bec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.526020 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1331546f-e949-4d01-97fa-48a28a165bec" (UID: "1331546f-e949-4d01-97fa-48a28a165bec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.546581 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299eab28-d9b3-4b6d-88d7-358f2c15fd2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.546622 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.546633 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.546648 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.547158 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1331546f-e949-4d01-97fa-48a28a165bec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.968480 4912 generic.go:334] "Generic (PLEG): container finished" podID="2d6235cc-b662-414c-97c5-0f5b3550d605" containerID="1e2416fab11d3ffe5be436544ad5826d6ece7509a24a2343bf598e243dfca294" exitCode=0 Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.972254 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7bf7bf4c9b-55dtz" podUID="cbda5023-7a4c-4e65-951e-545c8dc7ec49" containerName="heat-api" containerID="cri-o://149b608e045aa0e749331485c8e1644cc11ffdf5bcce06529685eb238800b48e" gracePeriod=60 Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.977586 4912 generic.go:334] "Generic (PLEG): container finished" podID="e1108851-b127-4ccb-8c81-bfbe9de7267e" containerID="58e70d147050f31eaf9672b997b4566c4df939fd9d9b936940889fab2e4b6faa" exitCode=0 Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.983949 4912 generic.go:334] "Generic (PLEG): container finished" podID="cc0d3692-3e21-4b00-9629-f5a4d2140ca9" containerID="edd91f2375d121cb13ef2d7ab1ae9b08383f08124083958323e19244892bae7f" exitCode=0 Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.986649 4912 generic.go:334] "Generic (PLEG): container finished" podID="72532937-9ae3-416d-968f-6a7031ec3055" containerID="e3aade8978425c41d2d64020c6223165d8a91513fcc8254ddf345c604fc65a17" exitCode=0 Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.988754 4912 generic.go:334] "Generic (PLEG): container finished" podID="ba0303c0-2058-42de-850b-ea7214f3900c" containerID="85e163b60ca91061e5b8642afb4b05df30eea078a0dc7303cc5d26b06a7e91e5" exitCode=0 Mar 18 13:27:58 crc kubenswrapper[4912]: I0318 13:27:58.991278 4912 
generic.go:334] "Generic (PLEG): container finished" podID="7499c8bd-8342-41ca-933a-d0975f9d18e5" containerID="a5f37e41a59cddebc7ea3a8d97862a13f1897831c952b7dc81090e18672b682a" exitCode=0 Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008655 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008694 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e151-account-create-update-9552s" event={"ID":"2d6235cc-b662-414c-97c5-0f5b3550d605","Type":"ContainerDied","Data":"1e2416fab11d3ffe5be436544ad5826d6ece7509a24a2343bf598e243dfca294"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008730 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008743 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf7bf4c9b-55dtz" event={"ID":"cbda5023-7a4c-4e65-951e-545c8dc7ec49","Type":"ContainerStarted","Data":"149b608e045aa0e749331485c8e1644cc11ffdf5bcce06529685eb238800b48e"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008757 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mpxrh" event={"ID":"e1108851-b127-4ccb-8c81-bfbe9de7267e","Type":"ContainerDied","Data":"58e70d147050f31eaf9672b997b4566c4df939fd9d9b936940889fab2e4b6faa"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008775 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" event={"ID":"cc0d3692-3e21-4b00-9629-f5a4d2140ca9","Type":"ContainerDied","Data":"edd91f2375d121cb13ef2d7ab1ae9b08383f08124083958323e19244892bae7f"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008793 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8prkm" 
event={"ID":"72532937-9ae3-416d-968f-6a7031ec3055","Type":"ContainerDied","Data":"e3aade8978425c41d2d64020c6223165d8a91513fcc8254ddf345c604fc65a17"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008808 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jhw4z" event={"ID":"ba0303c0-2058-42de-850b-ea7214f3900c","Type":"ContainerDied","Data":"85e163b60ca91061e5b8642afb4b05df30eea078a0dc7303cc5d26b06a7e91e5"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.008825 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" event={"ID":"7499c8bd-8342-41ca-933a-d0975f9d18e5","Type":"ContainerDied","Data":"a5f37e41a59cddebc7ea3a8d97862a13f1897831c952b7dc81090e18672b682a"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.013081 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" event={"ID":"7f1211a1-9d4a-474f-a2c1-f5f8777e5733","Type":"ContainerStarted","Data":"913bb44d4052aba9870f2b79b1256385206901513890336d451f0da152abd402"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.013370 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" podUID="7f1211a1-9d4a-474f-a2c1-f5f8777e5733" containerName="heat-cfnapi" containerID="cri-o://913bb44d4052aba9870f2b79b1256385206901513890336d451f0da152abd402" gracePeriod=60 Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.013545 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.042399 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerStarted","Data":"1d1f88ca21af884982b6d8a513e8b55535ca2f8e192a6ff28433cb0165ba052f"} Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.042668 
4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-central-agent" containerID="cri-o://b60fb01193de9b6c4e664548c173dfcafda38de6a6fe99127d17dff1020e669c" gracePeriod=30 Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.043049 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.043109 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="proxy-httpd" containerID="cri-o://1d1f88ca21af884982b6d8a513e8b55535ca2f8e192a6ff28433cb0165ba052f" gracePeriod=30 Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.043176 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="sg-core" containerID="cri-o://79cf2123c3fb6756becda720c0fad650ca043f1d0fb4eedde3f6548ccc9dc910" gracePeriod=30 Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.043231 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-notification-agent" containerID="cri-o://437dbc9666d9aa473bc31e046629372456abdf86c3ac6d228b5edbbdc6615445" gracePeriod=30 Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.059402 4912 scope.go:117] "RemoveContainer" containerID="454c7be19d73679c256628d13e874e24bb5eeb28e155e4562045b0c2c18d55b5" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.155063 4912 scope.go:117] "RemoveContainer" containerID="0b0d640751d2bc787e4ace3b0fdf3dfd60f52d033fee6c303d17b93c5acdfa11" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.191071 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-api-7bf7bf4c9b-55dtz" podStartSLOduration=6.535750517 podStartE2EDuration="22.1910308s" podCreationTimestamp="2026-03-18 13:27:37 +0000 UTC" firstStartedPulling="2026-03-18 13:27:39.489439812 +0000 UTC m=+1507.948867237" lastFinishedPulling="2026-03-18 13:27:55.144720105 +0000 UTC m=+1523.604147520" observedRunningTime="2026-03-18 13:27:59.146594827 +0000 UTC m=+1527.606022262" watchObservedRunningTime="2026-03-18 13:27:59.1910308 +0000 UTC m=+1527.650458225" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.248433 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.265175 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.282761 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7z4mf"] Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.314250 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7z4mf"] Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.330966 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:59 crc kubenswrapper[4912]: E0318 13:27:59.331757 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="registry-server" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.331778 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="registry-server" Mar 18 13:27:59 crc kubenswrapper[4912]: E0318 13:27:59.331818 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api-log" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.331828 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api-log" Mar 18 13:27:59 crc kubenswrapper[4912]: E0318 13:27:59.331838 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="extract-content" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.331845 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="extract-content" Mar 18 13:27:59 crc kubenswrapper[4912]: E0318 13:27:59.331875 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.331880 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" Mar 18 13:27:59 crc kubenswrapper[4912]: E0318 13:27:59.331900 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="init" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.331909 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="init" Mar 18 13:27:59 crc kubenswrapper[4912]: E0318 13:27:59.331917 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="dnsmasq-dns" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.331924 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="dnsmasq-dns" Mar 18 13:27:59 crc kubenswrapper[4912]: E0318 13:27:59.331934 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="extract-utilities" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.331940 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" 
containerName="extract-utilities" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.332218 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" containerName="registry-server" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.332242 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1331546f-e949-4d01-97fa-48a28a165bec" containerName="dnsmasq-dns" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.332255 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api-log" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.332274 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.334450 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.343426 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.343586 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.343768 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.354651 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.377292 4912 scope.go:117] "RemoveContainer" containerID="13c7bc06c22b4583634716708c3c968c5262308701cb89a628ba2c88628f21f4" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.379260 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.379345 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtcqt\" (UniqueName: \"kubernetes.io/projected/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-kube-api-access-vtcqt\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.379437 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-config-data-custom\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.379488 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.379589 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-scripts\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.379783 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-logs\") pod 
\"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.380092 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.380338 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-config-data\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.380510 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.391438 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.470187967 podStartE2EDuration="23.391407328s" podCreationTimestamp="2026-03-18 13:27:36 +0000 UTC" firstStartedPulling="2026-03-18 13:27:38.183261965 +0000 UTC m=+1506.642689390" lastFinishedPulling="2026-03-18 13:27:56.104481326 +0000 UTC m=+1524.563908751" observedRunningTime="2026-03-18 13:27:59.260298159 +0000 UTC m=+1527.719725584" watchObservedRunningTime="2026-03-18 13:27:59.391407328 +0000 UTC m=+1527.850834753" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.457494 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4p6pn"] 
Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483051 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-scripts\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483188 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4p6pn"] Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483292 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483410 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-logs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483732 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483849 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-config-data\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483916 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.483983 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.484011 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcqt\" (UniqueName: \"kubernetes.io/projected/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-kube-api-access-vtcqt\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.484119 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-config-data-custom\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.484304 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-logs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.484397 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.492301 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" podStartSLOduration=6.695304039 podStartE2EDuration="22.492266775s" podCreationTimestamp="2026-03-18 13:27:37 +0000 UTC" firstStartedPulling="2026-03-18 13:27:39.56833417 +0000 UTC m=+1508.027761595" lastFinishedPulling="2026-03-18 13:27:55.365296906 +0000 UTC m=+1523.824724331" observedRunningTime="2026-03-18 13:27:59.347470249 +0000 UTC m=+1527.806897694" watchObservedRunningTime="2026-03-18 13:27:59.492266775 +0000 UTC m=+1527.951694200" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.494184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.494558 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.495313 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-config-data-custom\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.505809 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-scripts\") pod 
\"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.508863 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.509983 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-config-data\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.540573 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtcqt\" (UniqueName: \"kubernetes.io/projected/f74a682c-ca05-498c-ab11-4ccf3d7d3b46-kube-api-access-vtcqt\") pod \"cinder-api-0\" (UID: \"f74a682c-ca05-498c-ab11-4ccf3d7d3b46\") " pod="openstack/cinder-api-0" Mar 18 13:27:59 crc kubenswrapper[4912]: I0318 13:27:59.808965 4912 scope.go:117] "RemoveContainer" containerID="a162166d3edc118ef7452a18bc245f968a9ef84696a13788ba367fa2487f1f5c" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.093533 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f59b977c9-rwwx4" event={"ID":"08a4effe-9a7e-449c-aba4-74d4b7a4f0ae","Type":"ContainerStarted","Data":"b708f400f6041379960e558a26a4fde52d134b48313f81007c0c0f8208bae1e8"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.093924 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.093961 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.155296 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6f59b977c9-rwwx4" podStartSLOduration=20.15527038 podStartE2EDuration="20.15527038s" podCreationTimestamp="2026-03-18 13:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:00.136967118 +0000 UTC m=+1528.596394563" watchObservedRunningTime="2026-03-18 13:28:00.15527038 +0000 UTC m=+1528.614697805" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169120 4912 generic.go:334] "Generic (PLEG): container finished" podID="40cdba8a-2602-427c-a95c-65edb257c369" containerID="1d1f88ca21af884982b6d8a513e8b55535ca2f8e192a6ff28433cb0165ba052f" exitCode=0 Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169177 4912 generic.go:334] "Generic (PLEG): container finished" podID="40cdba8a-2602-427c-a95c-65edb257c369" containerID="79cf2123c3fb6756becda720c0fad650ca043f1d0fb4eedde3f6548ccc9dc910" exitCode=2 Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169189 4912 generic.go:334] "Generic (PLEG): container finished" podID="40cdba8a-2602-427c-a95c-65edb257c369" containerID="437dbc9666d9aa473bc31e046629372456abdf86c3ac6d228b5edbbdc6615445" exitCode=0 Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169200 4912 generic.go:334] "Generic (PLEG): container finished" podID="40cdba8a-2602-427c-a95c-65edb257c369" containerID="b60fb01193de9b6c4e664548c173dfcafda38de6a6fe99127d17dff1020e669c" exitCode=0 Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169299 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerDied","Data":"1d1f88ca21af884982b6d8a513e8b55535ca2f8e192a6ff28433cb0165ba052f"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169340 
4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerDied","Data":"79cf2123c3fb6756becda720c0fad650ca043f1d0fb4eedde3f6548ccc9dc910"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169356 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerDied","Data":"437dbc9666d9aa473bc31e046629372456abdf86c3ac6d228b5edbbdc6615445"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.169369 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerDied","Data":"b60fb01193de9b6c4e664548c173dfcafda38de6a6fe99127d17dff1020e669c"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.213279 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564008-fqc26"] Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.215112 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-fqc26" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.233996 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.234298 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.234470 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.264740 4912 generic.go:334] "Generic (PLEG): container finished" podID="cbda5023-7a4c-4e65-951e-545c8dc7ec49" containerID="149b608e045aa0e749331485c8e1644cc11ffdf5bcce06529685eb238800b48e" exitCode=0 Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.276986 4912 generic.go:334] "Generic (PLEG): container finished" podID="6de3a595-b691-490f-961d-e0471af1f517" containerID="f28a3228091a4d71f62cf2267cec05a078eae35658dee10d9b798ad168b7bdc7" exitCode=1 Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.278054 4912 scope.go:117] "RemoveContainer" containerID="f28a3228091a4d71f62cf2267cec05a078eae35658dee10d9b798ad168b7bdc7" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.307792 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1331546f-e949-4d01-97fa-48a28a165bec" path="/var/lib/kubelet/pods/1331546f-e949-4d01-97fa-48a28a165bec/volumes" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.328978 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" path="/var/lib/kubelet/pods/299eab28-d9b3-4b6d-88d7-358f2c15fd2d/volumes" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.330728 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d9cf46-afab-44ff-b703-5c55afdcc2d2" 
path="/var/lib/kubelet/pods/e6d9cf46-afab-44ff-b703-5c55afdcc2d2/volumes" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.332026 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.332106 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.332123 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf7bf4c9b-55dtz" event={"ID":"cbda5023-7a4c-4e65-951e-545c8dc7ec49","Type":"ContainerDied","Data":"149b608e045aa0e749331485c8e1644cc11ffdf5bcce06529685eb238800b48e"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.332160 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"311d61bd-9241-486c-a8d5-22fc93f208bc","Type":"ContainerStarted","Data":"dd0fb60e48df3b4714e789ada9f8e489ef28823c70c7b7fbc25e1094e415576c"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.332180 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f6459544-dzg22" event={"ID":"6de3a595-b691-490f-961d-e0471af1f517","Type":"ContainerDied","Data":"f28a3228091a4d71f62cf2267cec05a078eae35658dee10d9b798ad168b7bdc7"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.332199 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4b74494b-wgmr2" event={"ID":"47d7c152-dc8f-4406-acbf-e5af6902d651","Type":"ContainerStarted","Data":"51c4a7ce9f97958dc39024e55b9c6f3c087a3e537706719be3968d3a3420fe9b"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.337235 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-fqc26"] Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.349882 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" 
event={"ID":"02cb4335-ba8d-434d-b6fe-047b87453890","Type":"ContainerStarted","Data":"578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.351630 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.355719 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls9dk\" (UniqueName: \"kubernetes.io/projected/284c4dc7-ec24-4baa-91e7-f0540ed73054-kube-api-access-ls9dk\") pod \"auto-csr-approver-29564008-fqc26\" (UID: \"284c4dc7-ec24-4baa-91e7-f0540ed73054\") " pod="openshift-infra/auto-csr-approver-29564008-fqc26" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.357591 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.365751 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=5.96181689 podStartE2EDuration="31.365710598s" podCreationTimestamp="2026-03-18 13:27:29 +0000 UTC" firstStartedPulling="2026-03-18 13:27:30.358243869 +0000 UTC m=+1498.817671284" lastFinishedPulling="2026-03-18 13:27:55.762137557 +0000 UTC m=+1524.221564992" observedRunningTime="2026-03-18 13:28:00.312240643 +0000 UTC m=+1528.771668068" watchObservedRunningTime="2026-03-18 13:28:00.365710598 +0000 UTC m=+1528.825138033" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.371474 4912 scope.go:117] "RemoveContainer" containerID="d7c6ba00564984f8a06283f3c97631907fde972d87d8ec35fea9d99f60b41ab9" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.394020 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-684f5dccdc-5k4b9" podStartSLOduration=16.393990337 podStartE2EDuration="16.393990337s" 
podCreationTimestamp="2026-03-18 13:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:00.346752019 +0000 UTC m=+1528.806179454" watchObservedRunningTime="2026-03-18 13:28:00.393990337 +0000 UTC m=+1528.853417762" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.418989 4912 generic.go:334] "Generic (PLEG): container finished" podID="7f1211a1-9d4a-474f-a2c1-f5f8777e5733" containerID="913bb44d4052aba9870f2b79b1256385206901513890336d451f0da152abd402" exitCode=0 Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.419336 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" event={"ID":"7f1211a1-9d4a-474f-a2c1-f5f8777e5733","Type":"ContainerDied","Data":"913bb44d4052aba9870f2b79b1256385206901513890336d451f0da152abd402"} Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.456355 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7b4b74494b-wgmr2" podStartSLOduration=11.456266178 podStartE2EDuration="11.456266178s" podCreationTimestamp="2026-03-18 13:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:00.423190431 +0000 UTC m=+1528.882617856" watchObservedRunningTime="2026-03-18 13:28:00.456266178 +0000 UTC m=+1528.915693603" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.458478 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls9dk\" (UniqueName: \"kubernetes.io/projected/284c4dc7-ec24-4baa-91e7-f0540ed73054-kube-api-access-ls9dk\") pod \"auto-csr-approver-29564008-fqc26\" (UID: \"284c4dc7-ec24-4baa-91e7-f0540ed73054\") " pod="openshift-infra/auto-csr-approver-29564008-fqc26" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.483409 4912 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" podStartSLOduration=12.483379746 podStartE2EDuration="12.483379746s" podCreationTimestamp="2026-03-18 13:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:00.443490306 +0000 UTC m=+1528.902917741" watchObservedRunningTime="2026-03-18 13:28:00.483379746 +0000 UTC m=+1528.942807171" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.500811 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls9dk\" (UniqueName: \"kubernetes.io/projected/284c4dc7-ec24-4baa-91e7-f0540ed73054-kube-api-access-ls9dk\") pod \"auto-csr-approver-29564008-fqc26\" (UID: \"284c4dc7-ec24-4baa-91e7-f0540ed73054\") " pod="openshift-infra/auto-csr-approver-29564008-fqc26" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.524194 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.682172 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq986\" (UniqueName: \"kubernetes.io/projected/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-kube-api-access-lq986\") pod \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.682902 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data-custom\") pod \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.683109 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data\") pod \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.683164 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-combined-ca-bundle\") pod \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\" (UID: \"7f1211a1-9d4a-474f-a2c1-f5f8777e5733\") " Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.707482 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-kube-api-access-lq986" (OuterVolumeSpecName: "kube-api-access-lq986") pod "7f1211a1-9d4a-474f-a2c1-f5f8777e5733" (UID: "7f1211a1-9d4a-474f-a2c1-f5f8777e5733"). InnerVolumeSpecName "kube-api-access-lq986". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.714302 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-fqc26" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.719316 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f1211a1-9d4a-474f-a2c1-f5f8777e5733" (UID: "7f1211a1-9d4a-474f-a2c1-f5f8777e5733"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.790805 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq986\" (UniqueName: \"kubernetes.io/projected/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-kube-api-access-lq986\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.790864 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.855862 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f1211a1-9d4a-474f-a2c1-f5f8777e5733" (UID: "7f1211a1-9d4a-474f-a2c1-f5f8777e5733"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.894990 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.930768 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data" (OuterVolumeSpecName: "config-data") pod "7f1211a1-9d4a-474f-a2c1-f5f8777e5733" (UID: "7f1211a1-9d4a-474f-a2c1-f5f8777e5733"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:00 crc kubenswrapper[4912]: I0318 13:28:00.997959 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1211a1-9d4a-474f-a2c1-f5f8777e5733-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.033555 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.052171 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.099720 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wc68\" (UniqueName: \"kubernetes.io/projected/cbda5023-7a4c-4e65-951e-545c8dc7ec49-kube-api-access-8wc68\") pod \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.099793 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-combined-ca-bundle\") pod \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.099865 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data-custom\") pod \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.100448 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data\") pod \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\" (UID: \"cbda5023-7a4c-4e65-951e-545c8dc7ec49\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.106615 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbda5023-7a4c-4e65-951e-545c8dc7ec49-kube-api-access-8wc68" (OuterVolumeSpecName: "kube-api-access-8wc68") pod "cbda5023-7a4c-4e65-951e-545c8dc7ec49" (UID: "cbda5023-7a4c-4e65-951e-545c8dc7ec49"). InnerVolumeSpecName "kube-api-access-8wc68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.134137 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cbda5023-7a4c-4e65-951e-545c8dc7ec49" (UID: "cbda5023-7a4c-4e65-951e-545c8dc7ec49"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.177184 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbda5023-7a4c-4e65-951e-545c8dc7ec49" (UID: "cbda5023-7a4c-4e65-951e-545c8dc7ec49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.202978 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-log-httpd\") pod \"40cdba8a-2602-427c-a95c-65edb257c369\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.203106 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsm6f\" (UniqueName: \"kubernetes.io/projected/40cdba8a-2602-427c-a95c-65edb257c369-kube-api-access-hsm6f\") pod \"40cdba8a-2602-427c-a95c-65edb257c369\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.203353 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-sg-core-conf-yaml\") pod \"40cdba8a-2602-427c-a95c-65edb257c369\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.203518 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-config-data\") pod \"40cdba8a-2602-427c-a95c-65edb257c369\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.203597 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "40cdba8a-2602-427c-a95c-65edb257c369" (UID: "40cdba8a-2602-427c-a95c-65edb257c369"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.203727 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-run-httpd\") pod \"40cdba8a-2602-427c-a95c-65edb257c369\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.203788 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-scripts\") pod \"40cdba8a-2602-427c-a95c-65edb257c369\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.203823 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-combined-ca-bundle\") pod \"40cdba8a-2602-427c-a95c-65edb257c369\" (UID: \"40cdba8a-2602-427c-a95c-65edb257c369\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.204388 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "40cdba8a-2602-427c-a95c-65edb257c369" (UID: "40cdba8a-2602-427c-a95c-65edb257c369"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.205074 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.205123 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wc68\" (UniqueName: \"kubernetes.io/projected/cbda5023-7a4c-4e65-951e-545c8dc7ec49-kube-api-access-8wc68\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.205139 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.205151 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.205161 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40cdba8a-2602-427c-a95c-65edb257c369-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.218906 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cdba8a-2602-427c-a95c-65edb257c369-kube-api-access-hsm6f" (OuterVolumeSpecName: "kube-api-access-hsm6f") pod "40cdba8a-2602-427c-a95c-65edb257c369" (UID: "40cdba8a-2602-427c-a95c-65edb257c369"). InnerVolumeSpecName "kube-api-access-hsm6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.223616 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-scripts" (OuterVolumeSpecName: "scripts") pod "40cdba8a-2602-427c-a95c-65edb257c369" (UID: "40cdba8a-2602-427c-a95c-65edb257c369"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.224339 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="299eab28-d9b3-4b6d-88d7-358f2c15fd2d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.215:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.239326 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data" (OuterVolumeSpecName: "config-data") pod "cbda5023-7a4c-4e65-951e-545c8dc7ec49" (UID: "cbda5023-7a4c-4e65-951e-545c8dc7ec49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.297222 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "40cdba8a-2602-427c-a95c-65edb257c369" (UID: "40cdba8a-2602-427c-a95c-65edb257c369"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.315553 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda5023-7a4c-4e65-951e-545c8dc7ec49-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.315592 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.315603 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsm6f\" (UniqueName: \"kubernetes.io/projected/40cdba8a-2602-427c-a95c-65edb257c369-kube-api-access-hsm6f\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.315618 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.364737 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40cdba8a-2602-427c-a95c-65edb257c369" (UID: "40cdba8a-2602-427c-a95c-65edb257c369"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.410242 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-config-data" (OuterVolumeSpecName: "config-data") pod "40cdba8a-2602-427c-a95c-65edb257c369" (UID: "40cdba8a-2602-427c-a95c-65edb257c369"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.421308 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.421353 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cdba8a-2602-427c-a95c-65edb257c369-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.448475 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.457624 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7bf7bf4c9b-55dtz" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.458147 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf7bf4c9b-55dtz" event={"ID":"cbda5023-7a4c-4e65-951e-545c8dc7ec49","Type":"ContainerDied","Data":"79c2c8a89de99574d3470e3d25ca74e83573a4bb52d824ed6e55bedc69479221"} Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.458313 4912 scope.go:117] "RemoveContainer" containerID="149b608e045aa0e749331485c8e1644cc11ffdf5bcce06529685eb238800b48e" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.462409 4912 generic.go:334] "Generic (PLEG): container finished" podID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerID="24233139f19657729753d62c6170756b06f1661857b14468dfac1560bb4286e0" exitCode=1 Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.462496 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6f768dfb8d-bq88g" 
event={"ID":"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2","Type":"ContainerDied","Data":"24233139f19657729753d62c6170756b06f1661857b14468dfac1560bb4286e0"} Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.463378 4912 scope.go:117] "RemoveContainer" containerID="24233139f19657729753d62c6170756b06f1661857b14468dfac1560bb4286e0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.475258 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.479659 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.481100 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-648f5d994b-9xb5t" event={"ID":"7f1211a1-9d4a-474f-a2c1-f5f8777e5733","Type":"ContainerDied","Data":"b895c91c3ee0fc28c3d355983f5242c3af0225a7c8274a815083f6e81bdba1ee"} Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.492508 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-684f5dccdc-5k4b9" event={"ID":"63512997-1801-4665-9f60-91d912dc57e8","Type":"ContainerStarted","Data":"2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21"} Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.510601 4912 scope.go:117] "RemoveContainer" containerID="913bb44d4052aba9870f2b79b1256385206901513890336d451f0da152abd402" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.516800 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40cdba8a-2602-427c-a95c-65edb257c369","Type":"ContainerDied","Data":"571e5f754fd76954bdaf1f9507cb3d3243e842bb770502bc4833c2934d8c720d"} Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.516891 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.531199 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x84nn\" (UniqueName: \"kubernetes.io/projected/7499c8bd-8342-41ca-933a-d0975f9d18e5-kube-api-access-x84nn\") pod \"7499c8bd-8342-41ca-933a-d0975f9d18e5\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.531450 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7499c8bd-8342-41ca-933a-d0975f9d18e5-operator-scripts\") pod \"7499c8bd-8342-41ca-933a-d0975f9d18e5\" (UID: \"7499c8bd-8342-41ca-933a-d0975f9d18e5\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.532276 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7499c8bd-8342-41ca-933a-d0975f9d18e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7499c8bd-8342-41ca-933a-d0975f9d18e5" (UID: "7499c8bd-8342-41ca-933a-d0975f9d18e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.533915 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7499c8bd-8342-41ca-933a-d0975f9d18e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.546408 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7499c8bd-8342-41ca-933a-d0975f9d18e5-kube-api-access-x84nn" (OuterVolumeSpecName: "kube-api-access-x84nn") pod "7499c8bd-8342-41ca-933a-d0975f9d18e5" (UID: "7499c8bd-8342-41ca-933a-d0975f9d18e5"). InnerVolumeSpecName "kube-api-access-x84nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.565350 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f6459544-dzg22" event={"ID":"6de3a595-b691-490f-961d-e0471af1f517","Type":"ContainerStarted","Data":"bf90d5bf159b44e990b77f7aa5d50d44290711eaadb26e4b11f765c3f60539fc"} Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.565844 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.689982 4912 scope.go:117] "RemoveContainer" containerID="1d1f88ca21af884982b6d8a513e8b55535ca2f8e192a6ff28433cb0165ba052f" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.710441 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x84nn\" (UniqueName: \"kubernetes.io/projected/7499c8bd-8342-41ca-933a-d0975f9d18e5-kube-api-access-x84nn\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.777980 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7bf7bf4c9b-55dtz"] Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.795653 4912 scope.go:117] "RemoveContainer" containerID="79cf2123c3fb6756becda720c0fad650ca043f1d0fb4eedde3f6548ccc9dc910" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.812391 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7bf7bf4c9b-55dtz"] Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.818352 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd4fk\" (UniqueName: \"kubernetes.io/projected/e1108851-b127-4ccb-8c81-bfbe9de7267e-kube-api-access-qd4fk\") pod \"e1108851-b127-4ccb-8c81-bfbe9de7267e\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.818731 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1108851-b127-4ccb-8c81-bfbe9de7267e-operator-scripts\") pod \"e1108851-b127-4ccb-8c81-bfbe9de7267e\" (UID: \"e1108851-b127-4ccb-8c81-bfbe9de7267e\") " Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.822383 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1108851-b127-4ccb-8c81-bfbe9de7267e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1108851-b127-4ccb-8c81-bfbe9de7267e" (UID: "e1108851-b127-4ccb-8c81-bfbe9de7267e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.835708 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1108851-b127-4ccb-8c81-bfbe9de7267e-kube-api-access-qd4fk" (OuterVolumeSpecName: "kube-api-access-qd4fk") pod "e1108851-b127-4ccb-8c81-bfbe9de7267e" (UID: "e1108851-b127-4ccb-8c81-bfbe9de7267e"). InnerVolumeSpecName "kube-api-access-qd4fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.843506 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.853335 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.893760 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.894624 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="proxy-httpd" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.894652 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="proxy-httpd" Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.895150 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="sg-core" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895166 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="sg-core" Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.895180 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7499c8bd-8342-41ca-933a-d0975f9d18e5" containerName="mariadb-account-create-update" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895187 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7499c8bd-8342-41ca-933a-d0975f9d18e5" containerName="mariadb-account-create-update" Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.895201 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-central-agent" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895209 4912 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-central-agent" Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.895222 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1211a1-9d4a-474f-a2c1-f5f8777e5733" containerName="heat-cfnapi" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895229 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1211a1-9d4a-474f-a2c1-f5f8777e5733" containerName="heat-cfnapi" Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.895241 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1108851-b127-4ccb-8c81-bfbe9de7267e" containerName="mariadb-database-create" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895249 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1108851-b127-4ccb-8c81-bfbe9de7267e" containerName="mariadb-database-create" Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.895267 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda5023-7a4c-4e65-951e-545c8dc7ec49" containerName="heat-api" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895276 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda5023-7a4c-4e65-951e-545c8dc7ec49" containerName="heat-api" Mar 18 13:28:01 crc kubenswrapper[4912]: E0318 13:28:01.895295 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-notification-agent" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895303 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-notification-agent" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895632 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1108851-b127-4ccb-8c81-bfbe9de7267e" containerName="mariadb-database-create" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895653 
4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="proxy-httpd" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895683 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="sg-core" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895698 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1211a1-9d4a-474f-a2c1-f5f8777e5733" containerName="heat-cfnapi" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895711 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbda5023-7a4c-4e65-951e-545c8dc7ec49" containerName="heat-api" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895722 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7499c8bd-8342-41ca-933a-d0975f9d18e5" containerName="mariadb-account-create-update" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895731 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-notification-agent" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.895746 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cdba8a-2602-427c-a95c-65edb257c369" containerName="ceilometer-central-agent" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.898926 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.902020 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.902637 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.912568 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.920509 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7f6459544-dzg22" podStartSLOduration=17.920475919 podStartE2EDuration="17.920475919s" podCreationTimestamp="2026-03-18 13:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:01.67463608 +0000 UTC m=+1530.134063545" watchObservedRunningTime="2026-03-18 13:28:01.920475919 +0000 UTC m=+1530.379903344" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.925556 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-log-httpd\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.925619 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-scripts\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.925657 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-run-httpd\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.925684 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.925771 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-config-data\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.925927 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.926017 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ctqr\" (UniqueName: \"kubernetes.io/projected/93d1c930-c034-4bae-9226-0b277f4e7542-kube-api-access-4ctqr\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.929434 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1108851-b127-4ccb-8c81-bfbe9de7267e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 
13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.929487 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd4fk\" (UniqueName: \"kubernetes.io/projected/e1108851-b127-4ccb-8c81-bfbe9de7267e-kube-api-access-qd4fk\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.942674 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-648f5d994b-9xb5t"] Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.963302 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-648f5d994b-9xb5t"] Mar 18 13:28:01 crc kubenswrapper[4912]: I0318 13:28:01.981441 4912 scope.go:117] "RemoveContainer" containerID="437dbc9666d9aa473bc31e046629372456abdf86c3ac6d228b5edbbdc6615445" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.023061 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:28:02 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:28:02 crc kubenswrapper[4912]: > Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.032383 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.032540 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ctqr\" (UniqueName: \"kubernetes.io/projected/93d1c930-c034-4bae-9226-0b277f4e7542-kube-api-access-4ctqr\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.032739 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-log-httpd\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.032773 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-scripts\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.032808 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-run-httpd\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.032840 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.032949 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-config-data\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.036839 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-log-httpd\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " 
pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.037833 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-run-httpd\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.056172 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.056313 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-scripts\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.056747 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-config-data\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.061611 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ctqr\" (UniqueName: \"kubernetes.io/projected/93d1c930-c034-4bae-9226-0b277f4e7542-kube-api-access-4ctqr\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.061657 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.174490 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.192992 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.206462 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.217287 4912 scope.go:117] "RemoveContainer" containerID="b60fb01193de9b6c4e664548c173dfcafda38de6a6fe99127d17dff1020e669c" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.238814 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.239790 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb82d\" (UniqueName: \"kubernetes.io/projected/2d6235cc-b662-414c-97c5-0f5b3550d605-kube-api-access-tb82d\") pod \"2d6235cc-b662-414c-97c5-0f5b3550d605\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.239878 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6235cc-b662-414c-97c5-0f5b3550d605-operator-scripts\") pod \"2d6235cc-b662-414c-97c5-0f5b3550d605\" (UID: \"2d6235cc-b662-414c-97c5-0f5b3550d605\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.240020 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xmxh\" (UniqueName: \"kubernetes.io/projected/72532937-9ae3-416d-968f-6a7031ec3055-kube-api-access-4xmxh\") pod \"72532937-9ae3-416d-968f-6a7031ec3055\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.240099 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72532937-9ae3-416d-968f-6a7031ec3055-operator-scripts\") pod \"72532937-9ae3-416d-968f-6a7031ec3055\" (UID: \"72532937-9ae3-416d-968f-6a7031ec3055\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.250149 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72532937-9ae3-416d-968f-6a7031ec3055-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72532937-9ae3-416d-968f-6a7031ec3055" (UID: "72532937-9ae3-416d-968f-6a7031ec3055"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.250431 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72532937-9ae3-416d-968f-6a7031ec3055-kube-api-access-4xmxh" (OuterVolumeSpecName: "kube-api-access-4xmxh") pod "72532937-9ae3-416d-968f-6a7031ec3055" (UID: "72532937-9ae3-416d-968f-6a7031ec3055"). InnerVolumeSpecName "kube-api-access-4xmxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.250482 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6235cc-b662-414c-97c5-0f5b3550d605-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d6235cc-b662-414c-97c5-0f5b3550d605" (UID: "2d6235cc-b662-414c-97c5-0f5b3550d605"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.252148 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6235cc-b662-414c-97c5-0f5b3550d605-kube-api-access-tb82d" (OuterVolumeSpecName: "kube-api-access-tb82d") pod "2d6235cc-b662-414c-97c5-0f5b3550d605" (UID: "2d6235cc-b662-414c-97c5-0f5b3550d605"). InnerVolumeSpecName "kube-api-access-tb82d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.274338 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.303019 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cdba8a-2602-427c-a95c-65edb257c369" path="/var/lib/kubelet/pods/40cdba8a-2602-427c-a95c-65edb257c369/volumes" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.304760 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1211a1-9d4a-474f-a2c1-f5f8777e5733" path="/var/lib/kubelet/pods/7f1211a1-9d4a-474f-a2c1-f5f8777e5733/volumes" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.307759 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbda5023-7a4c-4e65-951e-545c8dc7ec49" path="/var/lib/kubelet/pods/cbda5023-7a4c-4e65-951e-545c8dc7ec49/volumes" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.343360 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-operator-scripts\") pod \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\" (UID: \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.347122 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0303c0-2058-42de-850b-ea7214f3900c-operator-scripts\") pod \"ba0303c0-2058-42de-850b-ea7214f3900c\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.347528 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lkdg\" (UniqueName: \"kubernetes.io/projected/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-kube-api-access-7lkdg\") pod \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\" (UID: \"cc0d3692-3e21-4b00-9629-f5a4d2140ca9\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.347769 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6gn8g\" (UniqueName: \"kubernetes.io/projected/ba0303c0-2058-42de-850b-ea7214f3900c-kube-api-access-6gn8g\") pod \"ba0303c0-2058-42de-850b-ea7214f3900c\" (UID: \"ba0303c0-2058-42de-850b-ea7214f3900c\") " Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.343940 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc0d3692-3e21-4b00-9629-f5a4d2140ca9" (UID: "cc0d3692-3e21-4b00-9629-f5a4d2140ca9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.348079 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba0303c0-2058-42de-850b-ea7214f3900c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba0303c0-2058-42de-850b-ea7214f3900c" (UID: "ba0303c0-2058-42de-850b-ea7214f3900c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.364955 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0303c0-2058-42de-850b-ea7214f3900c-kube-api-access-6gn8g" (OuterVolumeSpecName: "kube-api-access-6gn8g") pod "ba0303c0-2058-42de-850b-ea7214f3900c" (UID: "ba0303c0-2058-42de-850b-ea7214f3900c"). InnerVolumeSpecName "kube-api-access-6gn8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.365417 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-kube-api-access-7lkdg" (OuterVolumeSpecName: "kube-api-access-7lkdg") pod "cc0d3692-3e21-4b00-9629-f5a4d2140ca9" (UID: "cc0d3692-3e21-4b00-9629-f5a4d2140ca9"). 
InnerVolumeSpecName "kube-api-access-7lkdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372786 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lkdg\" (UniqueName: \"kubernetes.io/projected/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-kube-api-access-7lkdg\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372821 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gn8g\" (UniqueName: \"kubernetes.io/projected/ba0303c0-2058-42de-850b-ea7214f3900c-kube-api-access-6gn8g\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372832 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb82d\" (UniqueName: \"kubernetes.io/projected/2d6235cc-b662-414c-97c5-0f5b3550d605-kube-api-access-tb82d\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372845 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6235cc-b662-414c-97c5-0f5b3550d605-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372856 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xmxh\" (UniqueName: \"kubernetes.io/projected/72532937-9ae3-416d-968f-6a7031ec3055-kube-api-access-4xmxh\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372865 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc0d3692-3e21-4b00-9629-f5a4d2140ca9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372874 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/72532937-9ae3-416d-968f-6a7031ec3055-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.372884 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0303c0-2058-42de-850b-ea7214f3900c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.538953 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-fqc26"] Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.568736 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.609628 4912 generic.go:334] "Generic (PLEG): container finished" podID="6de3a595-b691-490f-961d-e0471af1f517" containerID="bf90d5bf159b44e990b77f7aa5d50d44290711eaadb26e4b11f765c3f60539fc" exitCode=1 Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.609736 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f6459544-dzg22" event={"ID":"6de3a595-b691-490f-961d-e0471af1f517","Type":"ContainerDied","Data":"bf90d5bf159b44e990b77f7aa5d50d44290711eaadb26e4b11f765c3f60539fc"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.609816 4912 scope.go:117] "RemoveContainer" containerID="f28a3228091a4d71f62cf2267cec05a078eae35658dee10d9b798ad168b7bdc7" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.611010 4912 scope.go:117] "RemoveContainer" containerID="bf90d5bf159b44e990b77f7aa5d50d44290711eaadb26e4b11f765c3f60539fc" Mar 18 13:28:02 crc kubenswrapper[4912]: E0318 13:28:02.611366 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7f6459544-dzg22_openstack(6de3a595-b691-490f-961d-e0471af1f517)\"" 
pod="openstack/heat-cfnapi-7f6459544-dzg22" podUID="6de3a595-b691-490f-961d-e0471af1f517" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.617672 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8prkm" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.617852 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8prkm" event={"ID":"72532937-9ae3-416d-968f-6a7031ec3055","Type":"ContainerDied","Data":"3d9649e4ac4ad4ac5728d64b97691d542ac013330a494f9241ac7c124268d75f"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.617937 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d9649e4ac4ad4ac5728d64b97691d542ac013330a494f9241ac7c124268d75f" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.649318 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e151-account-create-update-9552s" event={"ID":"2d6235cc-b662-414c-97c5-0f5b3550d605","Type":"ContainerDied","Data":"33238352fd96f262db3a4689a16aa90d56117ed9f86b6ad71d79ba2fdab26a0c"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.649411 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33238352fd96f262db3a4689a16aa90d56117ed9f86b6ad71d79ba2fdab26a0c" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.649645 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e151-account-create-update-9552s" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.677928 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" event={"ID":"7499c8bd-8342-41ca-933a-d0975f9d18e5","Type":"ContainerDied","Data":"19df0ea236bf44b3c4dffff920c3c994ab2f9a6f95c048cab0474e947385ef25"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.677968 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ce1-account-create-update-b9dnq" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.677989 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19df0ea236bf44b3c4dffff920c3c994ab2f9a6f95c048cab0474e947385ef25" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.688013 4912 generic.go:334] "Generic (PLEG): container finished" podID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerID="3976496a9893de86d041784048f05dd3695668c9d8dbfb057d025fa32bc745f8" exitCode=1 Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.688130 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6f768dfb8d-bq88g" event={"ID":"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2","Type":"ContainerDied","Data":"3976496a9893de86d041784048f05dd3695668c9d8dbfb057d025fa32bc745f8"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.691504 4912 scope.go:117] "RemoveContainer" containerID="3976496a9893de86d041784048f05dd3695668c9d8dbfb057d025fa32bc745f8" Mar 18 13:28:02 crc kubenswrapper[4912]: E0318 13:28:02.692207 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6f768dfb8d-bq88g_openstack(1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2)\"" pod="openstack/heat-api-6f768dfb8d-bq88g" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.756284 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f74a682c-ca05-498c-ab11-4ccf3d7d3b46","Type":"ContainerStarted","Data":"ee84236b7177b20e5ca81ef15106cff1c31066513f045addf8ea73ebe34c6bc3"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.766370 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jhw4z" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.766551 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jhw4z" event={"ID":"ba0303c0-2058-42de-850b-ea7214f3900c","Type":"ContainerDied","Data":"9a78053251e141dbd921e44eb2975a8384d3aca5a29515dad1713f1b1b13ff94"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.766584 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a78053251e141dbd921e44eb2975a8384d3aca5a29515dad1713f1b1b13ff94" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.777082 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-fqc26" event={"ID":"284c4dc7-ec24-4baa-91e7-f0540ed73054","Type":"ContainerStarted","Data":"7404cfb4389e9f009a1206ffa6bf856ab699351e438a35d7280b2623572b7455"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.785736 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mpxrh" event={"ID":"e1108851-b127-4ccb-8c81-bfbe9de7267e","Type":"ContainerDied","Data":"944eea457a4d562321b02809b170da469f291eeb3e003fc50a477274463f725d"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.785821 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944eea457a4d562321b02809b170da469f291eeb3e003fc50a477274463f725d" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.785990 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mpxrh" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.801783 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.801981 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e03c-account-create-update-vtvfd" event={"ID":"cc0d3692-3e21-4b00-9629-f5a4d2140ca9","Type":"ContainerDied","Data":"e5bdd26264824cb4439396912f802090475086eaa340e9f1b9922d639e18a55a"} Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.802081 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5bdd26264824cb4439396912f802090475086eaa340e9f1b9922d639e18a55a" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.824185 4912 scope.go:117] "RemoveContainer" containerID="24233139f19657729753d62c6170756b06f1661857b14468dfac1560bb4286e0" Mar 18 13:28:02 crc kubenswrapper[4912]: I0318 13:28:02.978487 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:03 crc kubenswrapper[4912]: W0318 13:28:03.020239 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93d1c930_c034_4bae_9226_0b277f4e7542.slice/crio-d8a71904ad02dfc3640550c1911cd7ea4326b61df4e4997979e670b75e0da239 WatchSource:0}: Error finding container d8a71904ad02dfc3640550c1911cd7ea4326b61df4e4997979e670b75e0da239: Status 404 returned error can't find the container with id d8a71904ad02dfc3640550c1911cd7ea4326b61df4e4997979e670b75e0da239 Mar 18 13:28:03 crc kubenswrapper[4912]: I0318 13:28:03.844366 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerStarted","Data":"d8a71904ad02dfc3640550c1911cd7ea4326b61df4e4997979e670b75e0da239"} Mar 18 13:28:03 crc kubenswrapper[4912]: I0318 13:28:03.846818 4912 scope.go:117] "RemoveContainer" containerID="bf90d5bf159b44e990b77f7aa5d50d44290711eaadb26e4b11f765c3f60539fc" Mar 18 13:28:03 crc 
kubenswrapper[4912]: E0318 13:28:03.847319 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7f6459544-dzg22_openstack(6de3a595-b691-490f-961d-e0471af1f517)\"" pod="openstack/heat-cfnapi-7f6459544-dzg22" podUID="6de3a595-b691-490f-961d-e0471af1f517" Mar 18 13:28:03 crc kubenswrapper[4912]: I0318 13:28:03.863938 4912 scope.go:117] "RemoveContainer" containerID="3976496a9893de86d041784048f05dd3695668c9d8dbfb057d025fa32bc745f8" Mar 18 13:28:03 crc kubenswrapper[4912]: E0318 13:28:03.865387 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6f768dfb8d-bq88g_openstack(1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2)\"" pod="openstack/heat-api-6f768dfb8d-bq88g" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" Mar 18 13:28:05 crc kubenswrapper[4912]: I0318 13:28:05.051070 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:28:05 crc kubenswrapper[4912]: I0318 13:28:05.051504 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6f768dfb8d-bq88g" Mar 18 13:28:05 crc kubenswrapper[4912]: I0318 13:28:05.052963 4912 scope.go:117] "RemoveContainer" containerID="3976496a9893de86d041784048f05dd3695668c9d8dbfb057d025fa32bc745f8" Mar 18 13:28:05 crc kubenswrapper[4912]: E0318 13:28:05.053687 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6f768dfb8d-bq88g_openstack(1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2)\"" pod="openstack/heat-api-6f768dfb8d-bq88g" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" Mar 18 13:28:05 crc kubenswrapper[4912]: I0318 
13:28:05.099984 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-7f6459544-dzg22" Mar 18 13:28:05 crc kubenswrapper[4912]: I0318 13:28:05.101169 4912 scope.go:117] "RemoveContainer" containerID="bf90d5bf159b44e990b77f7aa5d50d44290711eaadb26e4b11f765c3f60539fc" Mar 18 13:28:05 crc kubenswrapper[4912]: E0318 13:28:05.101516 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7f6459544-dzg22_openstack(6de3a595-b691-490f-961d-e0471af1f517)\"" pod="openstack/heat-cfnapi-7f6459544-dzg22" podUID="6de3a595-b691-490f-961d-e0471af1f517" Mar 18 13:28:05 crc kubenswrapper[4912]: I0318 13:28:05.933093 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f74a682c-ca05-498c-ab11-4ccf3d7d3b46","Type":"ContainerStarted","Data":"dfdeddb291345a6c5d4c38041ca3638829412bcdb868bb4af3be3c248a132327"} Mar 18 13:28:05 crc kubenswrapper[4912]: I0318 13:28:05.999896 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:28:06 crc kubenswrapper[4912]: I0318 13:28:06.025422 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f59b977c9-rwwx4" Mar 18 13:28:06 crc kubenswrapper[4912]: I0318 13:28:06.947593 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerStarted","Data":"2d4a17e10ed963d27017c3ee8e718f356cb8f9b68eadd72278df6e7a584d7be4"} Mar 18 13:28:06 crc kubenswrapper[4912]: I0318 13:28:06.999339 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:28:06 crc kubenswrapper[4912]: I0318 13:28:06.999433 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.069170 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7b4b74494b-wgmr2"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.152114 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6f768dfb8d-bq88g"]
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.575933 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2c6pp"]
Mar 18 13:28:07 crc kubenswrapper[4912]: E0318 13:28:07.576851 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72532937-9ae3-416d-968f-6a7031ec3055" containerName="mariadb-database-create"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.576886 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="72532937-9ae3-416d-968f-6a7031ec3055" containerName="mariadb-database-create"
Mar 18 13:28:07 crc kubenswrapper[4912]: E0318 13:28:07.576943 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6235cc-b662-414c-97c5-0f5b3550d605" containerName="mariadb-account-create-update"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.576951 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6235cc-b662-414c-97c5-0f5b3550d605" containerName="mariadb-account-create-update"
Mar 18 13:28:07 crc kubenswrapper[4912]: E0318 13:28:07.576961 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0d3692-3e21-4b00-9629-f5a4d2140ca9" containerName="mariadb-account-create-update"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.576970 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0d3692-3e21-4b00-9629-f5a4d2140ca9" containerName="mariadb-account-create-update"
Mar 18 13:28:07 crc kubenswrapper[4912]: E0318 13:28:07.577014 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0303c0-2058-42de-850b-ea7214f3900c" containerName="mariadb-database-create"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.577024 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0303c0-2058-42de-850b-ea7214f3900c" containerName="mariadb-database-create"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.577372 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6235cc-b662-414c-97c5-0f5b3550d605" containerName="mariadb-account-create-update"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.577417 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0303c0-2058-42de-850b-ea7214f3900c" containerName="mariadb-database-create"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.577437 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0d3692-3e21-4b00-9629-f5a4d2140ca9" containerName="mariadb-account-create-update"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.577452 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="72532937-9ae3-416d-968f-6a7031ec3055" containerName="mariadb-database-create"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.584283 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.588954 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bjjnw"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.590305 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.590539 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.594900 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2c6pp"]
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.697775 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-config-data\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.698964 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-scripts\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.699047 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw24s\" (UniqueName: \"kubernetes.io/projected/99566935-653a-45d0-94fb-84e8e27435f9-kube-api-access-dw24s\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.699525 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.802439 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.802630 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-config-data\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.802772 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-scripts\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.802806 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw24s\" (UniqueName: \"kubernetes.io/projected/99566935-653a-45d0-94fb-84e8e27435f9-kube-api-access-dw24s\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.814009 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.817555 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-scripts\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.832951 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw24s\" (UniqueName: \"kubernetes.io/projected/99566935-653a-45d0-94fb-84e8e27435f9-kube-api-access-dw24s\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.835229 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-config-data\") pod \"nova-cell0-conductor-db-sync-2c6pp\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:07 crc kubenswrapper[4912]: I0318 13:28:07.926462 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2c6pp"
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.032660 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6f768dfb8d-bq88g" event={"ID":"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2","Type":"ContainerDied","Data":"3eaf5fbf02cb187329278a4b02ded18215172bac36be890564d931117cd5234a"}
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.032721 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eaf5fbf02cb187329278a4b02ded18215172bac36be890564d931117cd5234a"
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.049126 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6f768dfb8d-bq88g"
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.050347 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f74a682c-ca05-498c-ab11-4ccf3d7d3b46","Type":"ContainerStarted","Data":"b6b6a9296f9e34ab0ae077e283f36190cea280349fecd1329a2db81e495ae177"}
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.051967 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.088857 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerStarted","Data":"155b9fc578f4fd0269822b4f554166e106407f820bf5df84871e0eb0540ed4e5"}
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.103410 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-fqc26" event={"ID":"284c4dc7-ec24-4baa-91e7-f0540ed73054","Type":"ContainerStarted","Data":"d123da7fc856b3287879b131319b8b60419c82b22c02106327aba5b7ab445734"}
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.118872 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg7gt\" (UniqueName: \"kubernetes.io/projected/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-kube-api-access-fg7gt\") pod \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") "
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.119348 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-combined-ca-bundle\") pod \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") "
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.119601 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data\") pod \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") "
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.119751 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data-custom\") pod \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\" (UID: \"1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2\") "
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.127720 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-kube-api-access-fg7gt" (OuterVolumeSpecName: "kube-api-access-fg7gt") pod "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" (UID: "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2"). InnerVolumeSpecName "kube-api-access-fg7gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.195341 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.195310517 podStartE2EDuration="9.195310517s" podCreationTimestamp="2026-03-18 13:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:08.128598476 +0000 UTC m=+1536.588025921" watchObservedRunningTime="2026-03-18 13:28:08.195310517 +0000 UTC m=+1536.654737942"
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.207737 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564008-fqc26" podStartSLOduration=5.277355598 podStartE2EDuration="8.207635858s" podCreationTimestamp="2026-03-18 13:28:00 +0000 UTC" firstStartedPulling="2026-03-18 13:28:02.53824748 +0000 UTC m=+1530.997674905" lastFinishedPulling="2026-03-18 13:28:05.46852774 +0000 UTC m=+1533.927955165" observedRunningTime="2026-03-18 13:28:08.179545594 +0000 UTC m=+1536.638973039" watchObservedRunningTime="2026-03-18 13:28:08.207635858 +0000 UTC m=+1536.667063283"
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.228523 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg7gt\" (UniqueName: \"kubernetes.io/projected/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-kube-api-access-fg7gt\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.301674 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" (UID: "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.344684 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.419364 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" (UID: "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.448799 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.519316 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data" (OuterVolumeSpecName: "config-data") pod "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" (UID: "1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.552393 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:08 crc kubenswrapper[4912]: I0318 13:28:08.736335 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2c6pp"]
Mar 18 13:28:09 crc kubenswrapper[4912]: I0318 13:28:09.116484 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" event={"ID":"99566935-653a-45d0-94fb-84e8e27435f9","Type":"ContainerStarted","Data":"4d33c1cbcaccbacd94617e691e90822b0795c214b401d17eac653bfaedba716a"}
Mar 18 13:28:09 crc kubenswrapper[4912]: I0318 13:28:09.120850 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerStarted","Data":"30fb25555fb3076f2f1f7c15b85edad2b27be8c4b3c967742fff9306f06a2ad1"}
Mar 18 13:28:09 crc kubenswrapper[4912]: I0318 13:28:09.120914 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6f768dfb8d-bq88g"
Mar 18 13:28:09 crc kubenswrapper[4912]: I0318 13:28:09.180687 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6f768dfb8d-bq88g"]
Mar 18 13:28:09 crc kubenswrapper[4912]: I0318 13:28:09.205244 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6f768dfb8d-bq88g"]
Mar 18 13:28:09 crc kubenswrapper[4912]: I0318 13:28:09.541912 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:28:10 crc kubenswrapper[4912]: I0318 13:28:10.164527 4912 generic.go:334] "Generic (PLEG): container finished" podID="284c4dc7-ec24-4baa-91e7-f0540ed73054" containerID="d123da7fc856b3287879b131319b8b60419c82b22c02106327aba5b7ab445734" exitCode=0
Mar 18 13:28:10 crc kubenswrapper[4912]: I0318 13:28:10.164606 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-fqc26" event={"ID":"284c4dc7-ec24-4baa-91e7-f0540ed73054","Type":"ContainerDied","Data":"d123da7fc856b3287879b131319b8b60419c82b22c02106327aba5b7ab445734"}
Mar 18 13:28:10 crc kubenswrapper[4912]: I0318 13:28:10.257964 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" path="/var/lib/kubelet/pods/1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2/volumes"
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.198917 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-central-agent" containerID="cri-o://2d4a17e10ed963d27017c3ee8e718f356cb8f9b68eadd72278df6e7a584d7be4" gracePeriod=30
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.199250 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerStarted","Data":"4758d37a125cb103e3c5f2e1670bde6db90f10f2890e036c684736adf109407c"}
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.199314 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-notification-agent" containerID="cri-o://155b9fc578f4fd0269822b4f554166e106407f820bf5df84871e0eb0540ed4e5" gracePeriod=30
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.199319 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="sg-core" containerID="cri-o://30fb25555fb3076f2f1f7c15b85edad2b27be8c4b3c967742fff9306f06a2ad1" gracePeriod=30
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.199374 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="proxy-httpd" containerID="cri-o://4758d37a125cb103e3c5f2e1670bde6db90f10f2890e036c684736adf109407c" gracePeriod=30
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.199963 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.255110 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5099022 podStartE2EDuration="10.255072902s" podCreationTimestamp="2026-03-18 13:28:01 +0000 UTC" firstStartedPulling="2026-03-18 13:28:03.041780515 +0000 UTC m=+1531.501207940" lastFinishedPulling="2026-03-18 13:28:10.786951217 +0000 UTC m=+1539.246378642" observedRunningTime="2026-03-18 13:28:11.242158425 +0000 UTC m=+1539.701585860" watchObservedRunningTime="2026-03-18 13:28:11.255072902 +0000 UTC m=+1539.714500327"
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.958650 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" probeResult="failure" output=<
Mar 18 13:28:11 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 13:28:11 crc kubenswrapper[4912]: >
Mar 18 13:28:11 crc kubenswrapper[4912]: I0318 13:28:11.974911 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-fqc26"
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.095226 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls9dk\" (UniqueName: \"kubernetes.io/projected/284c4dc7-ec24-4baa-91e7-f0540ed73054-kube-api-access-ls9dk\") pod \"284c4dc7-ec24-4baa-91e7-f0540ed73054\" (UID: \"284c4dc7-ec24-4baa-91e7-f0540ed73054\") "
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.128495 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284c4dc7-ec24-4baa-91e7-f0540ed73054-kube-api-access-ls9dk" (OuterVolumeSpecName: "kube-api-access-ls9dk") pod "284c4dc7-ec24-4baa-91e7-f0540ed73054" (UID: "284c4dc7-ec24-4baa-91e7-f0540ed73054"). InnerVolumeSpecName "kube-api-access-ls9dk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.198862 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls9dk\" (UniqueName: \"kubernetes.io/projected/284c4dc7-ec24-4baa-91e7-f0540ed73054-kube-api-access-ls9dk\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.222179 4912 generic.go:334] "Generic (PLEG): container finished" podID="93d1c930-c034-4bae-9226-0b277f4e7542" containerID="30fb25555fb3076f2f1f7c15b85edad2b27be8c4b3c967742fff9306f06a2ad1" exitCode=2
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.222320 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerDied","Data":"30fb25555fb3076f2f1f7c15b85edad2b27be8c4b3c967742fff9306f06a2ad1"}
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.239880 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-fqc26"
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.257810 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-fqc26" event={"ID":"284c4dc7-ec24-4baa-91e7-f0540ed73054","Type":"ContainerDied","Data":"7404cfb4389e9f009a1206ffa6bf856ab699351e438a35d7280b2623572b7455"}
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.257873 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7404cfb4389e9f009a1206ffa6bf856ab699351e438a35d7280b2623572b7455"
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.750016 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz"
Mar 18 13:28:12 crc kubenswrapper[4912]: I0318 13:28:12.844838 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7f6459544-dzg22"]
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.087133 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-mtm5z"]
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.106858 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-mtm5z"]
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.284924 4912 generic.go:334] "Generic (PLEG): container finished" podID="93d1c930-c034-4bae-9226-0b277f4e7542" containerID="155b9fc578f4fd0269822b4f554166e106407f820bf5df84871e0eb0540ed4e5" exitCode=0
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.285193 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerDied","Data":"155b9fc578f4fd0269822b4f554166e106407f820bf5df84871e0eb0540ed4e5"}
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.512330 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f6459544-dzg22"
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.551853 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbplg\" (UniqueName: \"kubernetes.io/projected/6de3a595-b691-490f-961d-e0471af1f517-kube-api-access-cbplg\") pod \"6de3a595-b691-490f-961d-e0471af1f517\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") "
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.552332 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data-custom\") pod \"6de3a595-b691-490f-961d-e0471af1f517\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") "
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.552369 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-combined-ca-bundle\") pod \"6de3a595-b691-490f-961d-e0471af1f517\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") "
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.552395 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data\") pod \"6de3a595-b691-490f-961d-e0471af1f517\" (UID: \"6de3a595-b691-490f-961d-e0471af1f517\") "
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.581643 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6de3a595-b691-490f-961d-e0471af1f517" (UID: "6de3a595-b691-490f-961d-e0471af1f517"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.615860 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de3a595-b691-490f-961d-e0471af1f517-kube-api-access-cbplg" (OuterVolumeSpecName: "kube-api-access-cbplg") pod "6de3a595-b691-490f-961d-e0471af1f517" (UID: "6de3a595-b691-490f-961d-e0471af1f517"). InnerVolumeSpecName "kube-api-access-cbplg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.657514 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.657767 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbplg\" (UniqueName: \"kubernetes.io/projected/6de3a595-b691-490f-961d-e0471af1f517-kube-api-access-cbplg\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.676929 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data" (OuterVolumeSpecName: "config-data") pod "6de3a595-b691-490f-961d-e0471af1f517" (UID: "6de3a595-b691-490f-961d-e0471af1f517"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.740220 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6de3a595-b691-490f-961d-e0471af1f517" (UID: "6de3a595-b691-490f-961d-e0471af1f517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.764645 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:13 crc kubenswrapper[4912]: I0318 13:28:13.764683 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de3a595-b691-490f-961d-e0471af1f517-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:14 crc kubenswrapper[4912]: I0318 13:28:14.250264 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b2deea-af2a-420b-a2c1-6b109851ce15" path="/var/lib/kubelet/pods/c3b2deea-af2a-420b-a2c1-6b109851ce15/volumes"
Mar 18 13:28:14 crc kubenswrapper[4912]: I0318 13:28:14.304898 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f6459544-dzg22" event={"ID":"6de3a595-b691-490f-961d-e0471af1f517","Type":"ContainerDied","Data":"05e837a51c9e30ec3887f672ec245a4e74bcf9d0b13350dc5bf54206b449db04"}
Mar 18 13:28:14 crc kubenswrapper[4912]: I0318 13:28:14.305174 4912 scope.go:117] "RemoveContainer" containerID="bf90d5bf159b44e990b77f7aa5d50d44290711eaadb26e4b11f765c3f60539fc"
Mar 18 13:28:14 crc kubenswrapper[4912]: I0318 13:28:14.305518 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f6459544-dzg22"
Mar 18 13:28:14 crc kubenswrapper[4912]: I0318 13:28:14.353201 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7f6459544-dzg22"]
Mar 18 13:28:14 crc kubenswrapper[4912]: I0318 13:28:14.370344 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7f6459544-dzg22"]
Mar 18 13:28:15 crc kubenswrapper[4912]: I0318 13:28:15.099007 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-684f5dccdc-5k4b9"
Mar 18 13:28:15 crc kubenswrapper[4912]: I0318 13:28:15.169501 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-bb6dbd969-rqk8b"]
Mar 18 13:28:15 crc kubenswrapper[4912]: I0318 13:28:15.169846 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-bb6dbd969-rqk8b" podUID="802b4150-7c77-4913-9bd5-94cb3ecf7895" containerName="heat-engine" containerID="cri-o://89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" gracePeriod=60
Mar 18 13:28:16 crc kubenswrapper[4912]: I0318 13:28:16.246129 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de3a595-b691-490f-961d-e0471af1f517" path="/var/lib/kubelet/pods/6de3a595-b691-490f-961d-e0471af1f517/volumes"
Mar 18 13:28:17 crc kubenswrapper[4912]: I0318 13:28:17.375789 4912 generic.go:334] "Generic (PLEG): container finished" podID="93d1c930-c034-4bae-9226-0b277f4e7542" containerID="2d4a17e10ed963d27017c3ee8e718f356cb8f9b68eadd72278df6e7a584d7be4" exitCode=0
Mar 18 13:28:17 crc kubenswrapper[4912]: I0318 13:28:17.376517 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerDied","Data":"2d4a17e10ed963d27017c3ee8e718f356cb8f9b68eadd72278df6e7a584d7be4"}
Mar 18 13:28:17 crc kubenswrapper[4912]: E0318 13:28:17.511107 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 18 13:28:17 crc kubenswrapper[4912]: E0318 13:28:17.512561 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 18 13:28:17 crc kubenswrapper[4912]: E0318 13:28:17.513846 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 18 13:28:17 crc kubenswrapper[4912]: E0318 13:28:17.513888 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-bb6dbd969-rqk8b" podUID="802b4150-7c77-4913-9bd5-94cb3ecf7895" containerName="heat-engine"
Mar 18 13:28:20 crc kubenswrapper[4912]: I0318 13:28:20.149796 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 18 13:28:21 crc kubenswrapper[4912]: I0318 13:28:21.020610 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rqm8k"
Mar 18 13:28:21 crc kubenswrapper[4912]: I0318 13:28:21.126962 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rqm8k"
Mar 18 13:28:21 crc kubenswrapper[4912]: I0318 13:28:21.283265 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqm8k"]
Mar 18 13:28:22 crc kubenswrapper[4912]: I0318 13:28:22.264422 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="f74a682c-ca05-498c-ab11-4ccf3d7d3b46" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.235:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 13:28:22 crc kubenswrapper[4912]: I0318 13:28:22.521023 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rqm8k" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" containerID="cri-o://06215be31d7302bcf66e43bca95aede8926ab3093133d6005fafd86c3e8ebf11" gracePeriod=2
Mar 18 13:28:23 crc kubenswrapper[4912]: I0318 13:28:23.550139 4912 generic.go:334] "Generic (PLEG): container finished" podID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerID="06215be31d7302bcf66e43bca95aede8926ab3093133d6005fafd86c3e8ebf11" exitCode=0
Mar 18 13:28:23 crc kubenswrapper[4912]: I0318 13:28:23.550212 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqm8k" event={"ID":"01119d5a-cc51-4ab9-8bb6-17a0893b3da0","Type":"ContainerDied","Data":"06215be31d7302bcf66e43bca95aede8926ab3093133d6005fafd86c3e8ebf11"}
Mar 18 13:28:25 crc kubenswrapper[4912]: I0318 13:28:25.584879 4912 generic.go:334] "Generic (PLEG): container finished" podID="802b4150-7c77-4913-9bd5-94cb3ecf7895" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" exitCode=0
Mar 18 13:28:25 crc kubenswrapper[4912]: I0318 13:28:25.585707 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-bb6dbd969-rqk8b" event={"ID":"802b4150-7c77-4913-9bd5-94cb3ecf7895","Type":"ContainerDied","Data":"89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52"}
Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.349485 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified"
Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.350060 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dw24s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,Term
inationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-2c6pp_openstack(99566935-653a-45d0-94fb-84e8e27435f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.351200 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" podUID="99566935-653a-45d0-94fb-84e8e27435f9" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.471985 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.509280 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52 is running failed: container process not found" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.509812 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52 is running failed: container process not found" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.517825 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52 is running failed: container process not found" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.517918 4912 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-bb6dbd969-rqk8b" podUID="802b4150-7c77-4913-9bd5-94cb3ecf7895" containerName="heat-engine" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.538683 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-c8znv\" (UniqueName: \"kubernetes.io/projected/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-kube-api-access-c8znv\") pod \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.539163 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-catalog-content\") pod \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.539267 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-utilities\") pod \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\" (UID: \"01119d5a-cc51-4ab9-8bb6-17a0893b3da0\") " Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.540925 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-utilities" (OuterVolumeSpecName: "utilities") pod "01119d5a-cc51-4ab9-8bb6-17a0893b3da0" (UID: "01119d5a-cc51-4ab9-8bb6-17a0893b3da0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.556158 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-kube-api-access-c8znv" (OuterVolumeSpecName: "kube-api-access-c8znv") pod "01119d5a-cc51-4ab9-8bb6-17a0893b3da0" (UID: "01119d5a-cc51-4ab9-8bb6-17a0893b3da0"). InnerVolumeSpecName "kube-api-access-c8znv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.632512 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqm8k" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.632757 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqm8k" event={"ID":"01119d5a-cc51-4ab9-8bb6-17a0893b3da0","Type":"ContainerDied","Data":"dfd61faefc60713ecbb71ac2dd5a2fb6b26c6b4bd12b39531bc0b5cfd9337518"} Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.632800 4912 scope.go:117] "RemoveContainer" containerID="06215be31d7302bcf66e43bca95aede8926ab3093133d6005fafd86c3e8ebf11" Mar 18 13:28:27 crc kubenswrapper[4912]: E0318 13:28:27.635970 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" podUID="99566935-653a-45d0-94fb-84e8e27435f9" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.642712 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8znv\" (UniqueName: \"kubernetes.io/projected/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-kube-api-access-c8znv\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.642749 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.772671 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01119d5a-cc51-4ab9-8bb6-17a0893b3da0" (UID: "01119d5a-cc51-4ab9-8bb6-17a0893b3da0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.788546 4912 scope.go:117] "RemoveContainer" containerID="479c70d89d7e033b8d2e89e21de1199ed685ea2ffe4daaf49022e1097269e964" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.851577 4912 scope.go:117] "RemoveContainer" containerID="b6a5ecd7ae529677c227d79a14d7f9a37439dc4dd0e81e0ca38592882679895d" Mar 18 13:28:27 crc kubenswrapper[4912]: I0318 13:28:27.854516 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01119d5a-cc51-4ab9-8bb6-17a0893b3da0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.095481 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.118820 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqm8k"] Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.131128 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rqm8k"] Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.162254 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data-custom\") pod \"802b4150-7c77-4913-9bd5-94cb3ecf7895\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.162482 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4flc5\" (UniqueName: \"kubernetes.io/projected/802b4150-7c77-4913-9bd5-94cb3ecf7895-kube-api-access-4flc5\") pod \"802b4150-7c77-4913-9bd5-94cb3ecf7895\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 
13:28:28.162534 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-combined-ca-bundle\") pod \"802b4150-7c77-4913-9bd5-94cb3ecf7895\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.162813 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data\") pod \"802b4150-7c77-4913-9bd5-94cb3ecf7895\" (UID: \"802b4150-7c77-4913-9bd5-94cb3ecf7895\") " Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.172138 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "802b4150-7c77-4913-9bd5-94cb3ecf7895" (UID: "802b4150-7c77-4913-9bd5-94cb3ecf7895"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.191289 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802b4150-7c77-4913-9bd5-94cb3ecf7895-kube-api-access-4flc5" (OuterVolumeSpecName: "kube-api-access-4flc5") pod "802b4150-7c77-4913-9bd5-94cb3ecf7895" (UID: "802b4150-7c77-4913-9bd5-94cb3ecf7895"). InnerVolumeSpecName "kube-api-access-4flc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.241224 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "802b4150-7c77-4913-9bd5-94cb3ecf7895" (UID: "802b4150-7c77-4913-9bd5-94cb3ecf7895"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.251727 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" path="/var/lib/kubelet/pods/01119d5a-cc51-4ab9-8bb6-17a0893b3da0/volumes" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.268393 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.268437 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4flc5\" (UniqueName: \"kubernetes.io/projected/802b4150-7c77-4913-9bd5-94cb3ecf7895-kube-api-access-4flc5\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.268452 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.280572 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data" (OuterVolumeSpecName: "config-data") pod "802b4150-7c77-4913-9bd5-94cb3ecf7895" (UID: "802b4150-7c77-4913-9bd5-94cb3ecf7895"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.373225 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802b4150-7c77-4913-9bd5-94cb3ecf7895-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.646589 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-bb6dbd969-rqk8b" event={"ID":"802b4150-7c77-4913-9bd5-94cb3ecf7895","Type":"ContainerDied","Data":"554ed5b43999a761b436554ca012322af6a978c53f795d7ddb34fb9b4e6e5cd6"} Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.646663 4912 scope.go:117] "RemoveContainer" containerID="89e490d70b8b7ae49b0e3dd149d2a61c2bc8967fdccf0164ecbdf53a9ee10c52" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.646743 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-bb6dbd969-rqk8b" Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.690668 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-bb6dbd969-rqk8b"] Mar 18 13:28:28 crc kubenswrapper[4912]: I0318 13:28:28.700566 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-bb6dbd969-rqk8b"] Mar 18 13:28:30 crc kubenswrapper[4912]: I0318 13:28:30.243140 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802b4150-7c77-4913-9bd5-94cb3ecf7895" path="/var/lib/kubelet/pods/802b4150-7c77-4913-9bd5-94cb3ecf7895/volumes" Mar 18 13:28:32 crc kubenswrapper[4912]: I0318 13:28:32.236622 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 13:28:34 crc kubenswrapper[4912]: I0318 13:28:34.970883 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 18 13:28:34 crc kubenswrapper[4912]: I0318 13:28:34.971664 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-log" containerID="cri-o://42e0127644f532b348025f7df7619a404bc946dab2301b65068c037fe8bc9f8d" gracePeriod=30 Mar 18 13:28:34 crc kubenswrapper[4912]: I0318 13:28:34.971811 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-httpd" containerID="cri-o://00eca2aec65f34cd0d48bc6e825d46de32abceac203177a231daac5d742dfb97" gracePeriod=30 Mar 18 13:28:35 crc kubenswrapper[4912]: I0318 13:28:35.757818 4912 generic.go:334] "Generic (PLEG): container finished" podID="9a01f4d6-98d9-49df-b056-436f333909bb" containerID="42e0127644f532b348025f7df7619a404bc946dab2301b65068c037fe8bc9f8d" exitCode=143 Mar 18 13:28:35 crc kubenswrapper[4912]: I0318 13:28:35.757878 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a01f4d6-98d9-49df-b056-436f333909bb","Type":"ContainerDied","Data":"42e0127644f532b348025f7df7619a404bc946dab2301b65068c037fe8bc9f8d"} Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.130685 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.131588 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-log" containerID="cri-o://fea833f54f21a2c58861ffc97de67f6c570ec8fd3429ab4275e58e62000dc932" gracePeriod=30 Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.131647 4912 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-httpd" containerID="cri-o://f5801bb0293739b2976ca898be44b5d6224ac06ff96f49158776c37517b87d88" gracePeriod=30 Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.771373 4912 generic.go:334] "Generic (PLEG): container finished" podID="2b5cde83-581f-4589-b094-2248eb7430d3" containerID="fea833f54f21a2c58861ffc97de67f6c570ec8fd3429ab4275e58e62000dc932" exitCode=143 Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.771434 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b5cde83-581f-4589-b094-2248eb7430d3","Type":"ContainerDied","Data":"fea833f54f21a2c58861ffc97de67f6c570ec8fd3429ab4275e58e62000dc932"} Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.998987 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.999119 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:28:36 crc kubenswrapper[4912]: I0318 13:28:36.999217 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:28:37 crc kubenswrapper[4912]: I0318 13:28:36.999966 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b9b062d12671b37c0b8053de923c0a7e885a374a0f869eb2369e0f8236cbd4db"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:28:37 crc kubenswrapper[4912]: I0318 13:28:37.000070 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://b9b062d12671b37c0b8053de923c0a7e885a374a0f869eb2369e0f8236cbd4db" gracePeriod=600 Mar 18 13:28:37 crc kubenswrapper[4912]: I0318 13:28:37.789597 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="b9b062d12671b37c0b8053de923c0a7e885a374a0f869eb2369e0f8236cbd4db" exitCode=0 Mar 18 13:28:37 crc kubenswrapper[4912]: I0318 13:28:37.789651 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"b9b062d12671b37c0b8053de923c0a7e885a374a0f869eb2369e0f8236cbd4db"} Mar 18 13:28:37 crc kubenswrapper[4912]: I0318 13:28:37.789684 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6"} Mar 18 13:28:37 crc kubenswrapper[4912]: I0318 13:28:37.789705 4912 scope.go:117] "RemoveContainer" containerID="905ac753c282bcb8ff90a4cd97a7fced94d0b0133ec58021181ae4cebb3a39a9" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.807998 4912 generic.go:334] "Generic (PLEG): container finished" podID="9a01f4d6-98d9-49df-b056-436f333909bb" 
containerID="00eca2aec65f34cd0d48bc6e825d46de32abceac203177a231daac5d742dfb97" exitCode=0 Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.808110 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a01f4d6-98d9-49df-b056-436f333909bb","Type":"ContainerDied","Data":"00eca2aec65f34cd0d48bc6e825d46de32abceac203177a231daac5d742dfb97"} Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.808920 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a01f4d6-98d9-49df-b056-436f333909bb","Type":"ContainerDied","Data":"87ad7cfb29fe239b2c0f88dea6382c3d0d3aa0ff7ec626976e4287bcdb57293a"} Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.808943 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87ad7cfb29fe239b2c0f88dea6382c3d0d3aa0ff7ec626976e4287bcdb57293a" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.813624 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.866920 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-combined-ca-bundle\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.867024 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-config-data\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.867086 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-logs\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.867145 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-httpd-run\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.867261 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-public-tls-certs\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.867296 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vnx8\" (UniqueName: 
\"kubernetes.io/projected/9a01f4d6-98d9-49df-b056-436f333909bb-kube-api-access-4vnx8\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.868283 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.868344 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-scripts\") pod \"9a01f4d6-98d9-49df-b056-436f333909bb\" (UID: \"9a01f4d6-98d9-49df-b056-436f333909bb\") " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.868894 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-logs" (OuterVolumeSpecName: "logs") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.869528 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.870228 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.870251 4912 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a01f4d6-98d9-49df-b056-436f333909bb-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.881257 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-scripts" (OuterVolumeSpecName: "scripts") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.903347 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a01f4d6-98d9-49df-b056-436f333909bb-kube-api-access-4vnx8" (OuterVolumeSpecName: "kube-api-access-4vnx8") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). InnerVolumeSpecName "kube-api-access-4vnx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.916641 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a" (OuterVolumeSpecName: "glance") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). InnerVolumeSpecName "pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.954987 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.977958 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.978002 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vnx8\" (UniqueName: \"kubernetes.io/projected/9a01f4d6-98d9-49df-b056-436f333909bb-kube-api-access-4vnx8\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.978070 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") on node \"crc\" " Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.978082 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:38 crc kubenswrapper[4912]: I0318 13:28:38.984418 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.004368 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-config-data" (OuterVolumeSpecName: "config-data") pod "9a01f4d6-98d9-49df-b056-436f333909bb" (UID: "9a01f4d6-98d9-49df-b056-436f333909bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.046373 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.046567 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a") on node "crc" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.080581 4912 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.080623 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.080638 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a01f4d6-98d9-49df-b056-436f333909bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.836914 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="2b5cde83-581f-4589-b094-2248eb7430d3" containerID="f5801bb0293739b2976ca898be44b5d6224ac06ff96f49158776c37517b87d88" exitCode=0 Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.837524 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.841261 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b5cde83-581f-4589-b094-2248eb7430d3","Type":"ContainerDied","Data":"f5801bb0293739b2976ca898be44b5d6224ac06ff96f49158776c37517b87d88"} Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.841352 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b5cde83-581f-4589-b094-2248eb7430d3","Type":"ContainerDied","Data":"17e992449c8c65254f1a68e296534fa2d54b3b6ea3494e5c5a8ae94577f46005"} Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.841367 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17e992449c8c65254f1a68e296534fa2d54b3b6ea3494e5c5a8ae94577f46005" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.930199 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.958280 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:28:39 crc kubenswrapper[4912]: I0318 13:28:39.976149 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.004021 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-httpd-run\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.004193 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-logs\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.004254 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-internal-tls-certs\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.004402 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-config-data\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.004450 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76rdk\" (UniqueName: 
\"kubernetes.io/projected/2b5cde83-581f-4589-b094-2248eb7430d3-kube-api-access-76rdk\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.004504 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-scripts\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.004596 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-combined-ca-bundle\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.007921 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"2b5cde83-581f-4589-b094-2248eb7430d3\" (UID: \"2b5cde83-581f-4589-b094-2248eb7430d3\") " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.013348 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.013714 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-logs" (OuterVolumeSpecName: "logs") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.020223 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-scripts" (OuterVolumeSpecName: "scripts") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.034246 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5cde83-581f-4589-b094-2248eb7430d3-kube-api-access-76rdk" (OuterVolumeSpecName: "kube-api-access-76rdk") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). InnerVolumeSpecName "kube-api-access-76rdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.043141 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.043860 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-log" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.043890 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-log" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.043911 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de3a595-b691-490f-961d-e0471af1f517" containerName="heat-cfnapi" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.043920 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de3a595-b691-490f-961d-e0471af1f517" containerName="heat-cfnapi" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.043934 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-log" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.043943 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-log" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.043957 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerName="heat-api" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.043965 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerName="heat-api" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.043989 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerName="heat-api" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.043996 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerName="heat-api" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044010 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="extract-utilities" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044019 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="extract-utilities" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044053 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-httpd" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044063 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-httpd" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044083 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044094 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044115 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802b4150-7c77-4913-9bd5-94cb3ecf7895" containerName="heat-engine" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044124 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="802b4150-7c77-4913-9bd5-94cb3ecf7895" containerName="heat-engine" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044149 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-httpd" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044157 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-httpd" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044169 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de3a595-b691-490f-961d-e0471af1f517" containerName="heat-cfnapi" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044179 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de3a595-b691-490f-961d-e0471af1f517" containerName="heat-cfnapi" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044187 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="extract-content" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044196 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="extract-content" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.044217 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="284c4dc7-ec24-4baa-91e7-f0540ed73054" containerName="oc" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044225 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="284c4dc7-ec24-4baa-91e7-f0540ed73054" containerName="oc" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044505 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerName="heat-api" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044520 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de3a595-b691-490f-961d-e0471af1f517" containerName="heat-cfnapi" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044534 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="802b4150-7c77-4913-9bd5-94cb3ecf7895" containerName="heat-engine" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044555 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-httpd" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044575 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="01119d5a-cc51-4ab9-8bb6-17a0893b3da0" containerName="registry-server" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044586 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-httpd" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044604 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" containerName="glance-log" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.044615 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" containerName="glance-log" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.051311 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="284c4dc7-ec24-4baa-91e7-f0540ed73054" containerName="oc" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.056291 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f6d8995-d0ca-4a57-83a3-1b6350ac2bf2" containerName="heat-api" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.056374 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de3a595-b691-490f-961d-e0471af1f517" containerName="heat-cfnapi" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.059865 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.060160 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.065555 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.075308 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.120473 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76rdk\" (UniqueName: \"kubernetes.io/projected/2b5cde83-581f-4589-b094-2248eb7430d3-kube-api-access-76rdk\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.120519 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.120531 4912 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: 
I0318 13:28:40.120545 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5cde83-581f-4589-b094-2248eb7430d3-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.126276 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702" (OuterVolumeSpecName: "glance") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). InnerVolumeSpecName "pvc-7a77a725-7984-4a1f-952c-59e02a3e3702". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.174664 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-config-data" (OuterVolumeSpecName: "config-data") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.180128 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.190862 4912 scope.go:117] "RemoveContainer" containerID="f99c3661c41f36b7fce4ed7988b38fa923c7af3480b1e27b1820d0f3b4b5255b" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.215969 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2b5cde83-581f-4589-b094-2248eb7430d3" (UID: "2b5cde83-581f-4589-b094-2248eb7430d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223001 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223157 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223189 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 
13:28:40.223258 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223334 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-logs\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223355 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cftgw\" (UniqueName: \"kubernetes.io/projected/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-kube-api-access-cftgw\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223417 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223440 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc 
kubenswrapper[4912]: I0318 13:28:40.223527 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223553 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") on node \"crc\" " Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223564 4912 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.223575 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5cde83-581f-4589-b094-2248eb7430d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.294414 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.295077 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7a77a725-7984-4a1f-952c-59e02a3e3702" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702") on node "crc" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.300211 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a01f4d6-98d9-49df-b056-436f333909bb" path="/var/lib/kubelet/pods/9a01f4d6-98d9-49df-b056-436f333909bb/volumes" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325523 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325581 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325612 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325675 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325767 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-logs\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325788 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cftgw\" (UniqueName: \"kubernetes.io/projected/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-kube-api-access-cftgw\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325854 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.325874 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.326007 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.326311 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.326501 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-logs\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.334849 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.335113 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a72a99aa2c60b032f6beea238cfa834e3b34d6f5ea6f566ee310bef28b7d80e1/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.335445 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.337573 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.338251 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.338821 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.364783 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cftgw\" (UniqueName: \"kubernetes.io/projected/8dc0c338-1e2c-43b7-9d84-96a42e7df1a5-kube-api-access-cftgw\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.407605 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25e1bdf6-b2c7-4ade-b767-9994bbbb789a\") pod \"glance-default-external-api-0\" (UID: \"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5\") " pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.558917 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.691816 4912 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-internal-api-0_2b5cde83-581f-4589-b094-2248eb7430d3/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-internal-api-0_2b5cde83-581f-4589-b094-2248eb7430d3/glance-httpd/0.log: no such file or directory Mar 18 13:28:40 crc kubenswrapper[4912]: E0318 13:28:40.714027 4912 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-external-api-0_9a01f4d6-98d9-49df-b056-436f333909bb/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-external-api-0_9a01f4d6-98d9-49df-b056-436f333909bb/glance-httpd/0.log: no such file or directory Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.855188 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.945977 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:28:40 crc kubenswrapper[4912]: I0318 13:28:40.976803 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.007129 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.009737 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.033542 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.033736 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.037874 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.162636 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.162754 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.162889 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84708bef-9104-4bd8-8437-b068dfcb9f65-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.163141 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.163246 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.163474 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84708bef-9104-4bd8-8437-b068dfcb9f65-logs\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.163578 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrcf\" (UniqueName: \"kubernetes.io/projected/84708bef-9104-4bd8-8437-b068dfcb9f65-kube-api-access-fkrcf\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.163651 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266485 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266635 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84708bef-9104-4bd8-8437-b068dfcb9f65-logs\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266681 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrcf\" (UniqueName: \"kubernetes.io/projected/84708bef-9104-4bd8-8437-b068dfcb9f65-kube-api-access-fkrcf\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266712 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266761 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266799 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266873 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84708bef-9104-4bd8-8437-b068dfcb9f65-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.266987 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.267358 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84708bef-9104-4bd8-8437-b068dfcb9f65-logs\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.270685 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84708bef-9104-4bd8-8437-b068dfcb9f65-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.276664 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.276701 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.276705 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.277792 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84708bef-9104-4bd8-8437-b068dfcb9f65-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.277936 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.277964 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26f2544761a6fbc0806c14fe58682cf74b4da7181bc2e0537e305660953f7255/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.299932 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrcf\" (UniqueName: \"kubernetes.io/projected/84708bef-9104-4bd8-8437-b068dfcb9f65-kube-api-access-fkrcf\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.410801 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a77a725-7984-4a1f-952c-59e02a3e3702\") pod \"glance-default-internal-api-0\" (UID: \"84708bef-9104-4bd8-8437-b068dfcb9f65\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.479414 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.735576 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:41 crc kubenswrapper[4912]: E0318 13:28:41.866451 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93d1c930_c034_4bae_9226_0b277f4e7542.slice/crio-conmon-4758d37a125cb103e3c5f2e1670bde6db90f10f2890e036c684736adf109407c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93d1c930_c034_4bae_9226_0b277f4e7542.slice/crio-4758d37a125cb103e3c5f2e1670bde6db90f10f2890e036c684736adf109407c.scope\": RecentStats: unable to find data in memory cache]" Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.952456 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5","Type":"ContainerStarted","Data":"20f3c749f8c1d0377f297eef12778b0ecdc230fc81f222598a063fb7f7c55a60"} Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.998564 4912 generic.go:334] "Generic (PLEG): container finished" podID="93d1c930-c034-4bae-9226-0b277f4e7542" containerID="4758d37a125cb103e3c5f2e1670bde6db90f10f2890e036c684736adf109407c" exitCode=137 Mar 18 13:28:41 crc kubenswrapper[4912]: I0318 13:28:41.999257 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerDied","Data":"4758d37a125cb103e3c5f2e1670bde6db90f10f2890e036c684736adf109407c"} Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.037566 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.126116 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-config-data\") pod \"93d1c930-c034-4bae-9226-0b277f4e7542\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.126205 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-log-httpd\") pod \"93d1c930-c034-4bae-9226-0b277f4e7542\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.126299 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ctqr\" (UniqueName: \"kubernetes.io/projected/93d1c930-c034-4bae-9226-0b277f4e7542-kube-api-access-4ctqr\") pod \"93d1c930-c034-4bae-9226-0b277f4e7542\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.126441 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-combined-ca-bundle\") pod \"93d1c930-c034-4bae-9226-0b277f4e7542\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.126516 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-scripts\") pod \"93d1c930-c034-4bae-9226-0b277f4e7542\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.126592 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-sg-core-conf-yaml\") pod \"93d1c930-c034-4bae-9226-0b277f4e7542\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.126707 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-run-httpd\") pod \"93d1c930-c034-4bae-9226-0b277f4e7542\" (UID: \"93d1c930-c034-4bae-9226-0b277f4e7542\") " Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.127416 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93d1c930-c034-4bae-9226-0b277f4e7542" (UID: "93d1c930-c034-4bae-9226-0b277f4e7542"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.128062 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.128195 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93d1c930-c034-4bae-9226-0b277f4e7542" (UID: "93d1c930-c034-4bae-9226-0b277f4e7542"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.135422 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-scripts" (OuterVolumeSpecName: "scripts") pod "93d1c930-c034-4bae-9226-0b277f4e7542" (UID: "93d1c930-c034-4bae-9226-0b277f4e7542"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.138251 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d1c930-c034-4bae-9226-0b277f4e7542-kube-api-access-4ctqr" (OuterVolumeSpecName: "kube-api-access-4ctqr") pod "93d1c930-c034-4bae-9226-0b277f4e7542" (UID: "93d1c930-c034-4bae-9226-0b277f4e7542"). InnerVolumeSpecName "kube-api-access-4ctqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.175276 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93d1c930-c034-4bae-9226-0b277f4e7542" (UID: "93d1c930-c034-4bae-9226-0b277f4e7542"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.230306 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93d1c930-c034-4bae-9226-0b277f4e7542-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.230354 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ctqr\" (UniqueName: \"kubernetes.io/projected/93d1c930-c034-4bae-9226-0b277f4e7542-kube-api-access-4ctqr\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.230366 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.230374 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.278630 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5cde83-581f-4589-b094-2248eb7430d3" path="/var/lib/kubelet/pods/2b5cde83-581f-4589-b094-2248eb7430d3/volumes" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.304387 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93d1c930-c034-4bae-9226-0b277f4e7542" (UID: "93d1c930-c034-4bae-9226-0b277f4e7542"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.316292 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-config-data" (OuterVolumeSpecName: "config-data") pod "93d1c930-c034-4bae-9226-0b277f4e7542" (UID: "93d1c930-c034-4bae-9226-0b277f4e7542"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.333882 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.333931 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93d1c930-c034-4bae-9226-0b277f4e7542-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:42 crc kubenswrapper[4912]: I0318 13:28:42.494087 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:28:42 crc kubenswrapper[4912]: W0318 13:28:42.495237 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84708bef_9104_4bd8_8437_b068dfcb9f65.slice/crio-59cba28d1670a4025025004bff7bb7a21cb6a84314db93248d68128f153f4e87 WatchSource:0}: Error finding container 59cba28d1670a4025025004bff7bb7a21cb6a84314db93248d68128f153f4e87: Status 404 returned error can't find the container with id 59cba28d1670a4025025004bff7bb7a21cb6a84314db93248d68128f153f4e87 Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.024100 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" event={"ID":"99566935-653a-45d0-94fb-84e8e27435f9","Type":"ContainerStarted","Data":"076a1e44f753d42e5cd16f4c0204df3f7e47f2149a494e94c9f1932723b2c2fb"} Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.028756 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5","Type":"ContainerStarted","Data":"923feb4334c83d60ad015c33e3fbf0edb4e805c0e52038496767d59bf3582534"} Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.035002 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93d1c930-c034-4bae-9226-0b277f4e7542","Type":"ContainerDied","Data":"d8a71904ad02dfc3640550c1911cd7ea4326b61df4e4997979e670b75e0da239"} Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.035113 4912 scope.go:117] "RemoveContainer" containerID="4758d37a125cb103e3c5f2e1670bde6db90f10f2890e036c684736adf109407c" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.035291 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.050185 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84708bef-9104-4bd8-8437-b068dfcb9f65","Type":"ContainerStarted","Data":"59cba28d1670a4025025004bff7bb7a21cb6a84314db93248d68128f153f4e87"} Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.052248 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" podStartSLOduration=2.93797254 podStartE2EDuration="36.052213918s" podCreationTimestamp="2026-03-18 13:28:07 +0000 UTC" firstStartedPulling="2026-03-18 13:28:08.751493835 +0000 UTC m=+1537.210921260" lastFinishedPulling="2026-03-18 13:28:41.865735213 +0000 UTC m=+1570.325162638" observedRunningTime="2026-03-18 13:28:43.043909355 +0000 UTC m=+1571.503336780" watchObservedRunningTime="2026-03-18 13:28:43.052213918 +0000 UTC m=+1571.511641363" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.116962 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.149330 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.156320 4912 scope.go:117] "RemoveContainer" 
containerID="30fb25555fb3076f2f1f7c15b85edad2b27be8c4b3c967742fff9306f06a2ad1" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.171370 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:43 crc kubenswrapper[4912]: E0318 13:28:43.172225 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="proxy-httpd" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172249 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="proxy-httpd" Mar 18 13:28:43 crc kubenswrapper[4912]: E0318 13:28:43.172303 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-notification-agent" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172319 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-notification-agent" Mar 18 13:28:43 crc kubenswrapper[4912]: E0318 13:28:43.172345 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="sg-core" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172355 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="sg-core" Mar 18 13:28:43 crc kubenswrapper[4912]: E0318 13:28:43.172393 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-central-agent" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172403 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-central-agent" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172730 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-central-agent" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172770 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="sg-core" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172790 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="ceilometer-notification-agent" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.172814 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" containerName="proxy-httpd" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.177349 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.183882 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.186697 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.190008 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.252586 4912 scope.go:117] "RemoveContainer" containerID="155b9fc578f4fd0269822b4f554166e106407f820bf5df84871e0eb0540ed4e5" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.266223 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.266429 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-log-httpd\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.266746 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56m4\" (UniqueName: \"kubernetes.io/projected/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-kube-api-access-s56m4\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.266838 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-config-data\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.267122 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-run-httpd\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.267272 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-scripts\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.267472 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.294732 4912 scope.go:117] "RemoveContainer" containerID="2d4a17e10ed963d27017c3ee8e718f356cb8f9b68eadd72278df6e7a584d7be4" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.370472 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-log-httpd\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.371004 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56m4\" (UniqueName: \"kubernetes.io/projected/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-kube-api-access-s56m4\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.371052 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-config-data\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.371142 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-run-httpd\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.371217 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-scripts\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.371288 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.371409 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.372210 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-log-httpd\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.375585 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-run-httpd\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.381121 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-scripts\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.391528 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-config-data\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.401876 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56m4\" (UniqueName: \"kubernetes.io/projected/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-kube-api-access-s56m4\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.403875 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.414012 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " pod="openstack/ceilometer-0" Mar 18 13:28:43 crc kubenswrapper[4912]: I0318 13:28:43.507572 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:28:44 crc kubenswrapper[4912]: I0318 13:28:44.149680 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84708bef-9104-4bd8-8437-b068dfcb9f65","Type":"ContainerStarted","Data":"6e997c49884c772287f3b9cdeebe01599a67de4b6bceab86f2cf25f416c3d8e3"} Mar 18 13:28:44 crc kubenswrapper[4912]: I0318 13:28:44.190261 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dc0c338-1e2c-43b7-9d84-96a42e7df1a5","Type":"ContainerStarted","Data":"60a486431e1e5e362a25215349c2ee70e68abb37a2c1f876a7489b5362363b9c"} Mar 18 13:28:44 crc kubenswrapper[4912]: I0318 13:28:44.192777 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:44 crc kubenswrapper[4912]: W0318 13:28:44.288549 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ec2821f_350b_4435_8f0e_65b6d5c5b3ce.slice/crio-699a6ef722147c568a07a4fffdedb1cecbb45802cbff02000c58e11e216de0fe WatchSource:0}: Error finding container 699a6ef722147c568a07a4fffdedb1cecbb45802cbff02000c58e11e216de0fe: Status 404 returned error can't find the container with id 699a6ef722147c568a07a4fffdedb1cecbb45802cbff02000c58e11e216de0fe Mar 18 13:28:44 crc kubenswrapper[4912]: I0318 13:28:44.293280 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d1c930-c034-4bae-9226-0b277f4e7542" path="/var/lib/kubelet/pods/93d1c930-c034-4bae-9226-0b277f4e7542/volumes" Mar 18 13:28:45 crc kubenswrapper[4912]: I0318 13:28:45.212995 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerStarted","Data":"0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653"} Mar 18 13:28:45 crc kubenswrapper[4912]: I0318 13:28:45.214168 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerStarted","Data":"699a6ef722147c568a07a4fffdedb1cecbb45802cbff02000c58e11e216de0fe"} Mar 18 13:28:45 crc kubenswrapper[4912]: I0318 13:28:45.216648 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84708bef-9104-4bd8-8437-b068dfcb9f65","Type":"ContainerStarted","Data":"39a23fc1dcae81bab40d30039f925ab0a8de403c2fe1c5d7ac4d296121287488"} Mar 18 13:28:45 crc kubenswrapper[4912]: I0318 13:28:45.257376 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.257329145 podStartE2EDuration="5.257329145s" podCreationTimestamp="2026-03-18 13:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:45.239504486 +0000 UTC m=+1573.698931931" watchObservedRunningTime="2026-03-18 13:28:45.257329145 +0000 UTC m=+1573.716756580" Mar 18 13:28:45 crc kubenswrapper[4912]: I0318 13:28:45.261628 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.261605199 podStartE2EDuration="6.261605199s" podCreationTimestamp="2026-03-18 13:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:28:44.238662533 +0000 UTC m=+1572.698089978" watchObservedRunningTime="2026-03-18 13:28:45.261605199 +0000 UTC m=+1573.721032624" Mar 18 13:28:46 crc kubenswrapper[4912]: I0318 13:28:46.256349 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerStarted","Data":"f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563"} Mar 18 
13:28:47 crc kubenswrapper[4912]: I0318 13:28:47.141456 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:28:47 crc kubenswrapper[4912]: I0318 13:28:47.251097 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerStarted","Data":"e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d"} Mar 18 13:28:49 crc kubenswrapper[4912]: I0318 13:28:49.314318 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerStarted","Data":"847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30"} Mar 18 13:28:49 crc kubenswrapper[4912]: I0318 13:28:49.314877 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-central-agent" containerID="cri-o://0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653" gracePeriod=30 Mar 18 13:28:49 crc kubenswrapper[4912]: I0318 13:28:49.315165 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:28:49 crc kubenswrapper[4912]: I0318 13:28:49.315351 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="proxy-httpd" containerID="cri-o://847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30" gracePeriod=30 Mar 18 13:28:49 crc kubenswrapper[4912]: I0318 13:28:49.315383 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-notification-agent" containerID="cri-o://f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563" gracePeriod=30 Mar 18 13:28:49 crc 
kubenswrapper[4912]: I0318 13:28:49.315455 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="sg-core" containerID="cri-o://e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d" gracePeriod=30 Mar 18 13:28:49 crc kubenswrapper[4912]: I0318 13:28:49.343199 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.210417285 podStartE2EDuration="6.34317335s" podCreationTimestamp="2026-03-18 13:28:43 +0000 UTC" firstStartedPulling="2026-03-18 13:28:44.333452007 +0000 UTC m=+1572.792879452" lastFinishedPulling="2026-03-18 13:28:48.466208092 +0000 UTC m=+1576.925635517" observedRunningTime="2026-03-18 13:28:49.338667349 +0000 UTC m=+1577.798094784" watchObservedRunningTime="2026-03-18 13:28:49.34317335 +0000 UTC m=+1577.802600775" Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.330359 4912 generic.go:334] "Generic (PLEG): container finished" podID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerID="847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30" exitCode=0 Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.330874 4912 generic.go:334] "Generic (PLEG): container finished" podID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerID="e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d" exitCode=2 Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.330891 4912 generic.go:334] "Generic (PLEG): container finished" podID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerID="f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563" exitCode=0 Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.330459 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerDied","Data":"847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30"} Mar 18 
13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.330955 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerDied","Data":"e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d"} Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.331185 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerDied","Data":"f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563"} Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.559419 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.559492 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.608917 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:28:50 crc kubenswrapper[4912]: I0318 13:28:50.615948 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:28:51 crc kubenswrapper[4912]: I0318 13:28:51.342336 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:28:51 crc kubenswrapper[4912]: I0318 13:28:51.342872 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:28:51 crc kubenswrapper[4912]: I0318 13:28:51.737812 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:51 crc kubenswrapper[4912]: I0318 13:28:51.737877 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:51 crc kubenswrapper[4912]: I0318 13:28:51.796003 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:51 crc kubenswrapper[4912]: I0318 13:28:51.806002 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:52 crc kubenswrapper[4912]: I0318 13:28:52.355454 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:52 crc kubenswrapper[4912]: I0318 13:28:52.356114 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:54 crc kubenswrapper[4912]: I0318 13:28:54.297587 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:28:54 crc kubenswrapper[4912]: I0318 13:28:54.298163 4912 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:28:54 crc kubenswrapper[4912]: I0318 13:28:54.671025 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:28:55 crc kubenswrapper[4912]: I0318 13:28:55.013111 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:55 crc kubenswrapper[4912]: I0318 13:28:55.013735 4912 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:28:55 crc kubenswrapper[4912]: I0318 13:28:55.016986 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:28:57 crc kubenswrapper[4912]: I0318 13:28:57.440377 4912 generic.go:334] "Generic (PLEG): container finished" podID="99566935-653a-45d0-94fb-84e8e27435f9" 
containerID="076a1e44f753d42e5cd16f4c0204df3f7e47f2149a494e94c9f1932723b2c2fb" exitCode=0 Mar 18 13:28:57 crc kubenswrapper[4912]: I0318 13:28:57.440443 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" event={"ID":"99566935-653a-45d0-94fb-84e8e27435f9","Type":"ContainerDied","Data":"076a1e44f753d42e5cd16f4c0204df3f7e47f2149a494e94c9f1932723b2c2fb"} Mar 18 13:28:58 crc kubenswrapper[4912]: I0318 13:28:58.958194 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.027273 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-config-data\") pod \"99566935-653a-45d0-94fb-84e8e27435f9\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.027327 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-scripts\") pod \"99566935-653a-45d0-94fb-84e8e27435f9\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.027370 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-combined-ca-bundle\") pod \"99566935-653a-45d0-94fb-84e8e27435f9\" (UID: \"99566935-653a-45d0-94fb-84e8e27435f9\") " Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.027631 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw24s\" (UniqueName: \"kubernetes.io/projected/99566935-653a-45d0-94fb-84e8e27435f9-kube-api-access-dw24s\") pod \"99566935-653a-45d0-94fb-84e8e27435f9\" (UID: 
\"99566935-653a-45d0-94fb-84e8e27435f9\") " Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.037384 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99566935-653a-45d0-94fb-84e8e27435f9-kube-api-access-dw24s" (OuterVolumeSpecName: "kube-api-access-dw24s") pod "99566935-653a-45d0-94fb-84e8e27435f9" (UID: "99566935-653a-45d0-94fb-84e8e27435f9"). InnerVolumeSpecName "kube-api-access-dw24s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.039937 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-scripts" (OuterVolumeSpecName: "scripts") pod "99566935-653a-45d0-94fb-84e8e27435f9" (UID: "99566935-653a-45d0-94fb-84e8e27435f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.107516 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99566935-653a-45d0-94fb-84e8e27435f9" (UID: "99566935-653a-45d0-94fb-84e8e27435f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.107911 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-config-data" (OuterVolumeSpecName: "config-data") pod "99566935-653a-45d0-94fb-84e8e27435f9" (UID: "99566935-653a-45d0-94fb-84e8e27435f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.131362 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw24s\" (UniqueName: \"kubernetes.io/projected/99566935-653a-45d0-94fb-84e8e27435f9-kube-api-access-dw24s\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.131446 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.131459 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.131472 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99566935-653a-45d0-94fb-84e8e27435f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.471388 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" event={"ID":"99566935-653a-45d0-94fb-84e8e27435f9","Type":"ContainerDied","Data":"4d33c1cbcaccbacd94617e691e90822b0795c214b401d17eac653bfaedba716a"} Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.471479 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d33c1cbcaccbacd94617e691e90822b0795c214b401d17eac653bfaedba716a" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.472416 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2c6pp" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.603420 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:28:59 crc kubenswrapper[4912]: E0318 13:28:59.604175 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99566935-653a-45d0-94fb-84e8e27435f9" containerName="nova-cell0-conductor-db-sync" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.604199 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="99566935-653a-45d0-94fb-84e8e27435f9" containerName="nova-cell0-conductor-db-sync" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.604431 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="99566935-653a-45d0-94fb-84e8e27435f9" containerName="nova-cell0-conductor-db-sync" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.605401 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.607498 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bjjnw" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.608537 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.627569 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.646968 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be757545-6411-4e1a-bd46-6cddf5a22d61-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 
13:28:59.647252 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be757545-6411-4e1a-bd46-6cddf5a22d61-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.647298 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxj6f\" (UniqueName: \"kubernetes.io/projected/be757545-6411-4e1a-bd46-6cddf5a22d61-kube-api-access-vxj6f\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.749278 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be757545-6411-4e1a-bd46-6cddf5a22d61-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.749749 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxj6f\" (UniqueName: \"kubernetes.io/projected/be757545-6411-4e1a-bd46-6cddf5a22d61-kube-api-access-vxj6f\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.749847 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be757545-6411-4e1a-bd46-6cddf5a22d61-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.756853 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be757545-6411-4e1a-bd46-6cddf5a22d61-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.756870 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be757545-6411-4e1a-bd46-6cddf5a22d61-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.766720 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxj6f\" (UniqueName: \"kubernetes.io/projected/be757545-6411-4e1a-bd46-6cddf5a22d61-kube-api-access-vxj6f\") pod \"nova-cell0-conductor-0\" (UID: \"be757545-6411-4e1a-bd46-6cddf5a22d61\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:28:59 crc kubenswrapper[4912]: I0318 13:28:59.922996 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.266271 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.372710 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-combined-ca-bundle\") pod \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.372950 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-sg-core-conf-yaml\") pod \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.373197 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-scripts\") pod \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.373358 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-log-httpd\") pod \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.373448 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-run-httpd\") pod \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.373476 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-config-data\") pod \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.373601 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s56m4\" (UniqueName: \"kubernetes.io/projected/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-kube-api-access-s56m4\") pod \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\" (UID: \"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce\") " Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.373917 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" (UID: "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.374263 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" (UID: "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.374338 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.389750 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-scripts" (OuterVolumeSpecName: "scripts") pod "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" (UID: "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.410372 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-kube-api-access-s56m4" (OuterVolumeSpecName: "kube-api-access-s56m4") pod "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" (UID: "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce"). InnerVolumeSpecName "kube-api-access-s56m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.451474 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" (UID: "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.476536 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.476570 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.476580 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.476600 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s56m4\" (UniqueName: 
\"kubernetes.io/projected/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-kube-api-access-s56m4\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.485730 4912 generic.go:334] "Generic (PLEG): container finished" podID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerID="0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653" exitCode=0 Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.485868 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerDied","Data":"0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653"} Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.485907 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ec2821f-350b-4435-8f0e-65b6d5c5b3ce","Type":"ContainerDied","Data":"699a6ef722147c568a07a4fffdedb1cecbb45802cbff02000c58e11e216de0fe"} Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.485946 4912 scope.go:117] "RemoveContainer" containerID="847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.486216 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.552945 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" (UID: "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.554220 4912 scope.go:117] "RemoveContainer" containerID="e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.578827 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.583469 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.585890 4912 scope.go:117] "RemoveContainer" containerID="f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.590445 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-config-data" (OuterVolumeSpecName: "config-data") pod "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" (UID: "8ec2821f-350b-4435-8f0e-65b6d5c5b3ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.621286 4912 scope.go:117] "RemoveContainer" containerID="0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.648734 4912 scope.go:117] "RemoveContainer" containerID="847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30" Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 13:29:00.649333 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30\": container with ID starting with 847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30 not found: ID does not exist" containerID="847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.649372 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30"} err="failed to get container status \"847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30\": rpc error: code = NotFound desc = could not find container \"847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30\": container with ID starting with 847425d505e2beca7545538d462dbe210e38de50a91ccdd531d1df8426d63f30 not found: ID does not exist" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.649394 4912 scope.go:117] "RemoveContainer" containerID="e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d" Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 13:29:00.649764 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d\": container with ID starting with 
e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d not found: ID does not exist" containerID="e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.649792 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d"} err="failed to get container status \"e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d\": rpc error: code = NotFound desc = could not find container \"e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d\": container with ID starting with e56afb66dd0bfea35df9ac11c1167677feee162ca14dc40d8c9aa150cd73d44d not found: ID does not exist" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.649809 4912 scope.go:117] "RemoveContainer" containerID="f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563" Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 13:29:00.650387 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563\": container with ID starting with f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563 not found: ID does not exist" containerID="f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.650406 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563"} err="failed to get container status \"f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563\": rpc error: code = NotFound desc = could not find container \"f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563\": container with ID starting with f2f91dda4324ecf034549822048de07f45899fce1fdf66a6676764900edd3563 not found: ID does not 
exist" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.650421 4912 scope.go:117] "RemoveContainer" containerID="0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653" Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 13:29:00.650805 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653\": container with ID starting with 0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653 not found: ID does not exist" containerID="0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.650829 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653"} err="failed to get container status \"0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653\": rpc error: code = NotFound desc = could not find container \"0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653\": container with ID starting with 0f27715e0587fbb92c06444f56c8dcd6b26db1aa887cd94560bb40edf7971653 not found: ID does not exist" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.681707 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.857160 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.884952 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.905253 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 
13:29:00.905902 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="sg-core" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.905920 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="sg-core" Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 13:29:00.905945 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-notification-agent" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.905953 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-notification-agent" Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 13:29:00.905969 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-central-agent" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.905975 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-central-agent" Mar 18 13:29:00 crc kubenswrapper[4912]: E0318 13:29:00.906001 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="proxy-httpd" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.906007 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="proxy-httpd" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.906269 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="sg-core" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.906289 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="proxy-httpd" Mar 18 13:29:00 crc kubenswrapper[4912]: 
I0318 13:29:00.906302 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-central-agent" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.906330 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" containerName="ceilometer-notification-agent" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.908971 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.912389 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.912622 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.930476 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.992736 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-scripts\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.992860 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.992891 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-log-httpd\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.992986 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-run-httpd\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.993019 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7ww\" (UniqueName: \"kubernetes.io/projected/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-kube-api-access-jt7ww\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.993085 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:00 crc kubenswrapper[4912]: I0318 13:29:00.993213 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-config-data\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.095538 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-scripts\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " 
pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.095640 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.095678 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-log-httpd\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.095741 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-run-httpd\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.095766 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7ww\" (UniqueName: \"kubernetes.io/projected/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-kube-api-access-jt7ww\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.095811 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.095916 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-config-data\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.096504 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-log-httpd\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.096620 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-run-httpd\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.100357 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.101854 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-scripts\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.102554 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-config-data\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.102606 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.120150 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7ww\" (UniqueName: \"kubernetes.io/projected/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-kube-api-access-jt7ww\") pod \"ceilometer-0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.237764 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.501817 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"be757545-6411-4e1a-bd46-6cddf5a22d61","Type":"ContainerStarted","Data":"b66a003503ba051e7c6749a8368343fab1938444a7661f6f6ca1e74d685f5ad7"} Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.502281 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"be757545-6411-4e1a-bd46-6cddf5a22d61","Type":"ContainerStarted","Data":"6528a71eac34e35951b0c1ee0e168513141c1b7bf05625b61ffa3e5ea426acdf"} Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.502354 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.520720 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.520699537 podStartE2EDuration="2.520699537s" podCreationTimestamp="2026-03-18 13:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
13:29:01.519333251 +0000 UTC m=+1589.978760676" watchObservedRunningTime="2026-03-18 13:29:01.520699537 +0000 UTC m=+1589.980126962" Mar 18 13:29:01 crc kubenswrapper[4912]: I0318 13:29:01.782081 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:02 crc kubenswrapper[4912]: I0318 13:29:02.243580 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec2821f-350b-4435-8f0e-65b6d5c5b3ce" path="/var/lib/kubelet/pods/8ec2821f-350b-4435-8f0e-65b6d5c5b3ce/volumes" Mar 18 13:29:02 crc kubenswrapper[4912]: I0318 13:29:02.533506 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerStarted","Data":"bcdd2e2c2aeb4ceef905e5b2b545bab0b861aee5ca2773ec7ef037e38ec60dc0"} Mar 18 13:29:02 crc kubenswrapper[4912]: I0318 13:29:02.993492 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:03 crc kubenswrapper[4912]: I0318 13:29:03.553390 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerStarted","Data":"19cb2ba536a1cf529f30c4ba31ab481ee8e35343206c731afe825547a84a4ecc"} Mar 18 13:29:03 crc kubenswrapper[4912]: I0318 13:29:03.553964 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerStarted","Data":"6359120c35c4dcf8625761ca0cdd19ab1b62024f8cfa18373d0dffb53ab18cf5"} Mar 18 13:29:04 crc kubenswrapper[4912]: I0318 13:29:04.573267 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerStarted","Data":"9d5b236431f5a9e92c9d5b6bf23286a9f56fc4f0b2466ef50823b235e1ed6d61"} Mar 18 13:29:07 crc kubenswrapper[4912]: I0318 13:29:07.687066 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerStarted","Data":"9281d9f50dcb02146d941a05a398731ea96b15309e9b3483b58b7d1a9023d2c7"} Mar 18 13:29:07 crc kubenswrapper[4912]: I0318 13:29:07.688004 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:29:07 crc kubenswrapper[4912]: I0318 13:29:07.687307 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="ceilometer-notification-agent" containerID="cri-o://19cb2ba536a1cf529f30c4ba31ab481ee8e35343206c731afe825547a84a4ecc" gracePeriod=30 Mar 18 13:29:07 crc kubenswrapper[4912]: I0318 13:29:07.687276 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="ceilometer-central-agent" containerID="cri-o://6359120c35c4dcf8625761ca0cdd19ab1b62024f8cfa18373d0dffb53ab18cf5" gracePeriod=30 Mar 18 13:29:07 crc kubenswrapper[4912]: I0318 13:29:07.687356 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="sg-core" containerID="cri-o://9d5b236431f5a9e92c9d5b6bf23286a9f56fc4f0b2466ef50823b235e1ed6d61" gracePeriod=30 Mar 18 13:29:07 crc kubenswrapper[4912]: I0318 13:29:07.687351 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="proxy-httpd" containerID="cri-o://9281d9f50dcb02146d941a05a398731ea96b15309e9b3483b58b7d1a9023d2c7" gracePeriod=30 Mar 18 13:29:07 crc kubenswrapper[4912]: I0318 13:29:07.722512 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.050379615 podStartE2EDuration="7.722470386s" 
podCreationTimestamp="2026-03-18 13:29:00 +0000 UTC" firstStartedPulling="2026-03-18 13:29:01.793808978 +0000 UTC m=+1590.253236403" lastFinishedPulling="2026-03-18 13:29:06.465899749 +0000 UTC m=+1594.925327174" observedRunningTime="2026-03-18 13:29:07.710412562 +0000 UTC m=+1596.169840007" watchObservedRunningTime="2026-03-18 13:29:07.722470386 +0000 UTC m=+1596.181897821" Mar 18 13:29:08 crc kubenswrapper[4912]: I0318 13:29:08.702585 4912 generic.go:334] "Generic (PLEG): container finished" podID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerID="9281d9f50dcb02146d941a05a398731ea96b15309e9b3483b58b7d1a9023d2c7" exitCode=0 Mar 18 13:29:08 crc kubenswrapper[4912]: I0318 13:29:08.704085 4912 generic.go:334] "Generic (PLEG): container finished" podID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerID="9d5b236431f5a9e92c9d5b6bf23286a9f56fc4f0b2466ef50823b235e1ed6d61" exitCode=2 Mar 18 13:29:08 crc kubenswrapper[4912]: I0318 13:29:08.704243 4912 generic.go:334] "Generic (PLEG): container finished" podID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerID="19cb2ba536a1cf529f30c4ba31ab481ee8e35343206c731afe825547a84a4ecc" exitCode=0 Mar 18 13:29:08 crc kubenswrapper[4912]: I0318 13:29:08.702646 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerDied","Data":"9281d9f50dcb02146d941a05a398731ea96b15309e9b3483b58b7d1a9023d2c7"} Mar 18 13:29:08 crc kubenswrapper[4912]: I0318 13:29:08.704470 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerDied","Data":"9d5b236431f5a9e92c9d5b6bf23286a9f56fc4f0b2466ef50823b235e1ed6d61"} Mar 18 13:29:08 crc kubenswrapper[4912]: I0318 13:29:08.704581 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerDied","Data":"19cb2ba536a1cf529f30c4ba31ab481ee8e35343206c731afe825547a84a4ecc"} Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.788298 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-b97r9"] Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.790659 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.808730 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-b97r9"] Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.950685 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-abab-account-create-update-zlj6m"] Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.956481 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.964201 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.964325 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 18 13:29:09 crc kubenswrapper[4912]: I0318 13:29:09.971773 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-abab-account-create-update-zlj6m"] Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.017153 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-operator-scripts\") pod \"aodh-db-create-b97r9\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.018237 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v445f\" (UniqueName: \"kubernetes.io/projected/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-kube-api-access-v445f\") pod \"aodh-db-create-b97r9\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.121624 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-operator-scripts\") pod \"aodh-db-create-b97r9\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.122885 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pklnk\" (UniqueName: \"kubernetes.io/projected/c4c77729-b8c8-4973-bd84-43b29765e681-kube-api-access-pklnk\") pod \"aodh-abab-account-create-update-zlj6m\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.123521 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c77729-b8c8-4973-bd84-43b29765e681-operator-scripts\") pod \"aodh-abab-account-create-update-zlj6m\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.122992 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-operator-scripts\") pod \"aodh-db-create-b97r9\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.124203 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v445f\" (UniqueName: \"kubernetes.io/projected/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-kube-api-access-v445f\") pod \"aodh-db-create-b97r9\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.153141 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v445f\" (UniqueName: \"kubernetes.io/projected/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-kube-api-access-v445f\") pod \"aodh-db-create-b97r9\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.230664 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pklnk\" (UniqueName: \"kubernetes.io/projected/c4c77729-b8c8-4973-bd84-43b29765e681-kube-api-access-pklnk\") pod \"aodh-abab-account-create-update-zlj6m\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.230764 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c77729-b8c8-4973-bd84-43b29765e681-operator-scripts\") pod \"aodh-abab-account-create-update-zlj6m\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.231995 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c77729-b8c8-4973-bd84-43b29765e681-operator-scripts\") pod \"aodh-abab-account-create-update-zlj6m\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.251990 
4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pklnk\" (UniqueName: \"kubernetes.io/projected/c4c77729-b8c8-4973-bd84-43b29765e681-kube-api-access-pklnk\") pod \"aodh-abab-account-create-update-zlj6m\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.294957 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.428262 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.611253 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lkdsg"] Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.615671 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.625273 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.625527 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.668691 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkqkz\" (UniqueName: \"kubernetes.io/projected/7e3e3ac4-e8a9-473c-96cb-479132a1882d-kube-api-access-hkqkz\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.668790 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-config-data\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.668849 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-scripts\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.668963 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.694985 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lkdsg"] Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.781214 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkqkz\" (UniqueName: \"kubernetes.io/projected/7e3e3ac4-e8a9-473c-96cb-479132a1882d-kube-api-access-hkqkz\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.781714 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-config-data\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" 
Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.781915 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-scripts\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.788121 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.790475 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-config-data\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.803353 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.825063 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-scripts\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.827497 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hkqkz\" (UniqueName: \"kubernetes.io/projected/7e3e3ac4-e8a9-473c-96cb-479132a1882d-kube-api-access-hkqkz\") pod \"nova-cell0-cell-mapping-lkdsg\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.831829 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.836229 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.841817 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.894026 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.895903 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-config-data\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.895936 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d547696f-3142-42a2-8f44-3aee1b48849c-logs\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.896068 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " 
pod="openstack/nova-api-0" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.896166 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgfrl\" (UniqueName: \"kubernetes.io/projected/d547696f-3142-42a2-8f44-3aee1b48849c-kube-api-access-pgfrl\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:10 crc kubenswrapper[4912]: I0318 13:29:10.986153 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.005162 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.005433 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgfrl\" (UniqueName: \"kubernetes.io/projected/d547696f-3142-42a2-8f44-3aee1b48849c-kube-api-access-pgfrl\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.016739 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-config-data\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.016823 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d547696f-3142-42a2-8f44-3aee1b48849c-logs\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" 
Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.017163 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.021557 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d547696f-3142-42a2-8f44-3aee1b48849c-logs\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.024923 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-config-data\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.052918 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgfrl\" (UniqueName: \"kubernetes.io/projected/d547696f-3142-42a2-8f44-3aee1b48849c-kube-api-access-pgfrl\") pod \"nova-api-0\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") " pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.053452 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.060911 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.063202 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.065460 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.082276 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.082622 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.118961 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.119020 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-config-data\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.119174 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w7w9\" (UniqueName: \"kubernetes.io/projected/a6f8d91c-a434-4964-9581-bda0cf40119e-kube-api-access-8w7w9\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.119215 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-config-data\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.119237 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.119286 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhfwx\" (UniqueName: \"kubernetes.io/projected/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-kube-api-access-lhfwx\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.119350 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f8d91c-a434-4964-9581-bda0cf40119e-logs\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.196025 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.198815 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:29:11 crc kubenswrapper[4912]: W0318 13:29:11.222224 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4c77729_b8c8_4973_bd84_43b29765e681.slice/crio-cfb104d31b90e57ba683bd5108528d078d0132b3367b95076491534968c52d1e WatchSource:0}: Error finding container cfb104d31b90e57ba683bd5108528d078d0132b3367b95076491534968c52d1e: Status 404 returned error can't find the container with id cfb104d31b90e57ba683bd5108528d078d0132b3367b95076491534968c52d1e Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.223703 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.224166 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhfwx\" (UniqueName: \"kubernetes.io/projected/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-kube-api-access-lhfwx\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.224260 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f8d91c-a434-4964-9581-bda0cf40119e-logs\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.224292 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.224314 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-config-data\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.224417 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w7w9\" (UniqueName: \"kubernetes.io/projected/a6f8d91c-a434-4964-9581-bda0cf40119e-kube-api-access-8w7w9\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.224453 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-config-data\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.224474 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.231771 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f8d91c-a434-4964-9581-bda0cf40119e-logs\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.249424 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 
18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.260819 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-config-data\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.265243 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-abab-account-create-update-zlj6m"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.271987 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-config-data\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.272072 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.303732 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhfwx\" (UniqueName: \"kubernetes.io/projected/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-kube-api-access-lhfwx\") pod \"nova-scheduler-0\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.307436 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w7w9\" (UniqueName: \"kubernetes.io/projected/a6f8d91c-a434-4964-9581-bda0cf40119e-kube-api-access-8w7w9\") pod \"nova-metadata-0\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") " pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc 
kubenswrapper[4912]: I0318 13:29:11.330815 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-47t7b"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.334970 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.409124 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-47t7b"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.449631 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.492867 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.550235 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.550390 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-config\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.551780 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " 
pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.551982 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.552191 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.552234 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb4k4\" (UniqueName: \"kubernetes.io/projected/e93b7e52-0a4c-4372-a747-43fc785c0990-kube-api-access-kb4k4\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.675457 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.675601 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: 
\"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.675634 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb4k4\" (UniqueName: \"kubernetes.io/projected/e93b7e52-0a4c-4372-a747-43fc785c0990-kube-api-access-kb4k4\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.675890 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.675960 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-config\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.676153 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.677061 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " 
pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.681437 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.696606 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.716640 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-config\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.724775 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.755263 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb4k4\" (UniqueName: \"kubernetes.io/projected/e93b7e52-0a4c-4372-a747-43fc785c0990-kube-api-access-kb4k4\") pod \"dnsmasq-dns-568d7fd7cf-47t7b\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 
13:29:11.812289 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.829073 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.897907 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.905904 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-abab-account-create-update-zlj6m" event={"ID":"c4c77729-b8c8-4973-bd84-43b29765e681","Type":"ContainerStarted","Data":"cfb104d31b90e57ba683bd5108528d078d0132b3367b95076491534968c52d1e"} Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.918743 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.920182 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.931944 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75zpb\" (UniqueName: \"kubernetes.io/projected/37e6287a-9a5f-44a2-b9ca-a855075fd554-kube-api-access-75zpb\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.941717 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-b97r9" event={"ID":"ab76845a-bb55-4956-9bc9-4066fe9d6d0f","Type":"ContainerStarted","Data":"1e63d8d1ccdb5dae4b0eb080b09ce737332e1747428f1a98b03b4f0556934164"} Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.949474 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.950011 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:11 crc kubenswrapper[4912]: I0318 13:29:11.952193 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-b97r9"] Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.055024 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75zpb\" (UniqueName: 
\"kubernetes.io/projected/37e6287a-9a5f-44a2-b9ca-a855075fd554-kube-api-access-75zpb\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.055447 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.055688 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.065057 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.067303 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.091520 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75zpb\" (UniqueName: \"kubernetes.io/projected/37e6287a-9a5f-44a2-b9ca-a855075fd554-kube-api-access-75zpb\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.266376 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.359695 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lkdsg"] Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.541881 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:12 crc kubenswrapper[4912]: W0318 13:29:12.579185 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd547696f_3142_42a2_8f44_3aee1b48849c.slice/crio-21304a692e71b2b72f10e71d0aedeeab67bd5ce453457b327a2cf7b0fa311ed2 WatchSource:0}: Error finding container 21304a692e71b2b72f10e71d0aedeeab67bd5ce453457b327a2cf7b0fa311ed2: Status 404 returned error can't find the container with id 21304a692e71b2b72f10e71d0aedeeab67bd5ce453457b327a2cf7b0fa311ed2 Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.714410 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.728366 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:29:12 crc kubenswrapper[4912]: W0318 13:29:12.768268 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2cfe2a1_f3d3_4ece_aa38_2c1bd14c3071.slice/crio-06a60f47c94c61bb55fc0b58d932a31c1bb7db9fe33cc6a7208686c70f8be1eb WatchSource:0}: Error finding container 06a60f47c94c61bb55fc0b58d932a31c1bb7db9fe33cc6a7208686c70f8be1eb: Status 404 returned error can't find the container with id 06a60f47c94c61bb55fc0b58d932a31c1bb7db9fe33cc6a7208686c70f8be1eb Mar 18 13:29:12 crc 
kubenswrapper[4912]: W0318 13:29:12.769207 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f8d91c_a434_4964_9581_bda0cf40119e.slice/crio-9d30497b3dbfde15dc5ed0813328679138aaee9a05a90b9b7af52573d937a6b4 WatchSource:0}: Error finding container 9d30497b3dbfde15dc5ed0813328679138aaee9a05a90b9b7af52573d937a6b4: Status 404 returned error can't find the container with id 9d30497b3dbfde15dc5ed0813328679138aaee9a05a90b9b7af52573d937a6b4 Mar 18 13:29:12 crc kubenswrapper[4912]: E0318 13:29:12.818304 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab76845a_bb55_4956_9bc9_4066fe9d6d0f.slice/crio-b80e16972749a07a0dc993e0d46563eb439416c41a6a1a9335cb4aaeb8f4fd8a.scope\": RecentStats: unable to find data in memory cache]" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.821428 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tqk4j"] Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.824428 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.834563 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.835339 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.866820 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tqk4j"] Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.938328 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-config-data\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.939112 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q647\" (UniqueName: \"kubernetes.io/projected/0531362c-01f6-463c-8217-e78b33f55630-kube-api-access-4q647\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.939179 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-scripts\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.939226 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.993127 4912 generic.go:334] "Generic (PLEG): container finished" podID="ab76845a-bb55-4956-9bc9-4066fe9d6d0f" containerID="b80e16972749a07a0dc993e0d46563eb439416c41a6a1a9335cb4aaeb8f4fd8a" exitCode=0 Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.993630 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-b97r9" event={"ID":"ab76845a-bb55-4956-9bc9-4066fe9d6d0f","Type":"ContainerDied","Data":"b80e16972749a07a0dc993e0d46563eb439416c41a6a1a9335cb4aaeb8f4fd8a"} Mar 18 13:29:12 crc kubenswrapper[4912]: I0318 13:29:12.994408 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-47t7b"] Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.007964 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6f8d91c-a434-4964-9581-bda0cf40119e","Type":"ContainerStarted","Data":"9d30497b3dbfde15dc5ed0813328679138aaee9a05a90b9b7af52573d937a6b4"} Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.016607 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lkdsg" event={"ID":"7e3e3ac4-e8a9-473c-96cb-479132a1882d","Type":"ContainerStarted","Data":"50d70d7b2299d03ca2093efe5d684b0402017cb72f675664bf60c42241e1afec"} Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.028214 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" event={"ID":"e93b7e52-0a4c-4372-a747-43fc785c0990","Type":"ContainerStarted","Data":"76d8209de6757f1e848e1870313ee04b5895ff697bb8c59ae8bc36362a7ad02a"} Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.031483 4912 
generic.go:334] "Generic (PLEG): container finished" podID="c4c77729-b8c8-4973-bd84-43b29765e681" containerID="f731d42d55b0d10c0a20f3b599df01cc52356b3edbab12131b34da069e7e683a" exitCode=0 Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.031674 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-abab-account-create-update-zlj6m" event={"ID":"c4c77729-b8c8-4973-bd84-43b29765e681","Type":"ContainerDied","Data":"f731d42d55b0d10c0a20f3b599df01cc52356b3edbab12131b34da069e7e683a"} Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.043368 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-config-data\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.043762 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q647\" (UniqueName: \"kubernetes.io/projected/0531362c-01f6-463c-8217-e78b33f55630-kube-api-access-4q647\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.043872 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-scripts\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.043950 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.050738 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-config-data\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.051100 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d547696f-3142-42a2-8f44-3aee1b48849c","Type":"ContainerStarted","Data":"21304a692e71b2b72f10e71d0aedeeab67bd5ce453457b327a2cf7b0fa311ed2"} Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.053800 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lkdsg" podStartSLOduration=3.053773309 podStartE2EDuration="3.053773309s" podCreationTimestamp="2026-03-18 13:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:13.047131701 +0000 UTC m=+1601.506559126" watchObservedRunningTime="2026-03-18 13:29:13.053773309 +0000 UTC m=+1601.513200724" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.055305 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.056799 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-scripts\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.078394 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071","Type":"ContainerStarted","Data":"06a60f47c94c61bb55fc0b58d932a31c1bb7db9fe33cc6a7208686c70f8be1eb"} Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.086320 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q647\" (UniqueName: \"kubernetes.io/projected/0531362c-01f6-463c-8217-e78b33f55630-kube-api-access-4q647\") pod \"nova-cell1-conductor-db-sync-tqk4j\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.189234 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.373910 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:13 crc kubenswrapper[4912]: W0318 13:29:13.971246 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0531362c_01f6_463c_8217_e78b33f55630.slice/crio-47046dd385c6a5167adc987dbdb4d42962f580543d12f4100876213aa99d4181 WatchSource:0}: Error finding container 47046dd385c6a5167adc987dbdb4d42962f580543d12f4100876213aa99d4181: Status 404 returned error can't find the container with id 47046dd385c6a5167adc987dbdb4d42962f580543d12f4100876213aa99d4181 Mar 18 13:29:13 crc kubenswrapper[4912]: I0318 13:29:13.973219 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tqk4j"] Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.102237 4912 generic.go:334] "Generic (PLEG): container finished" podID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerID="adf9f7a50dfe50e2deaf98dc66900e042b0e6803b1f47ee8dd5c7421a17600f9" exitCode=0 Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.103076 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" event={"ID":"e93b7e52-0a4c-4372-a747-43fc785c0990","Type":"ContainerDied","Data":"adf9f7a50dfe50e2deaf98dc66900e042b0e6803b1f47ee8dd5c7421a17600f9"} Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.114348 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"37e6287a-9a5f-44a2-b9ca-a855075fd554","Type":"ContainerStarted","Data":"e387ef6c10c90a60a8702de8c702c479b11a27cc5a7cb90f7897c519403e8fab"} Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.121136 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" 
event={"ID":"0531362c-01f6-463c-8217-e78b33f55630","Type":"ContainerStarted","Data":"47046dd385c6a5167adc987dbdb4d42962f580543d12f4100876213aa99d4181"} Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.155580 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lkdsg" event={"ID":"7e3e3ac4-e8a9-473c-96cb-479132a1882d","Type":"ContainerStarted","Data":"a17a15307628daee1060fc12806a4ac94691a1855194e5d949370fc0ecd59cd1"} Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.743764 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.786109 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.862538 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-operator-scripts\") pod \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.863264 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pklnk\" (UniqueName: \"kubernetes.io/projected/c4c77729-b8c8-4973-bd84-43b29765e681-kube-api-access-pklnk\") pod \"c4c77729-b8c8-4973-bd84-43b29765e681\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.863369 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v445f\" (UniqueName: \"kubernetes.io/projected/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-kube-api-access-v445f\") pod \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\" (UID: \"ab76845a-bb55-4956-9bc9-4066fe9d6d0f\") " Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 
13:29:14.863416 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c77729-b8c8-4973-bd84-43b29765e681-operator-scripts\") pod \"c4c77729-b8c8-4973-bd84-43b29765e681\" (UID: \"c4c77729-b8c8-4973-bd84-43b29765e681\") " Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.864278 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab76845a-bb55-4956-9bc9-4066fe9d6d0f" (UID: "ab76845a-bb55-4956-9bc9-4066fe9d6d0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.866040 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c77729-b8c8-4973-bd84-43b29765e681-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4c77729-b8c8-4973-bd84-43b29765e681" (UID: "c4c77729-b8c8-4973-bd84-43b29765e681"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.867779 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.867818 4912 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c77729-b8c8-4973-bd84-43b29765e681-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.872667 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c77729-b8c8-4973-bd84-43b29765e681-kube-api-access-pklnk" (OuterVolumeSpecName: "kube-api-access-pklnk") pod "c4c77729-b8c8-4973-bd84-43b29765e681" (UID: "c4c77729-b8c8-4973-bd84-43b29765e681"). InnerVolumeSpecName "kube-api-access-pklnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.873310 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-kube-api-access-v445f" (OuterVolumeSpecName: "kube-api-access-v445f") pod "ab76845a-bb55-4956-9bc9-4066fe9d6d0f" (UID: "ab76845a-bb55-4956-9bc9-4066fe9d6d0f"). InnerVolumeSpecName "kube-api-access-v445f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.971213 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pklnk\" (UniqueName: \"kubernetes.io/projected/c4c77729-b8c8-4973-bd84-43b29765e681-kube-api-access-pklnk\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:14 crc kubenswrapper[4912]: I0318 13:29:14.971250 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v445f\" (UniqueName: \"kubernetes.io/projected/ab76845a-bb55-4956-9bc9-4066fe9d6d0f-kube-api-access-v445f\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.293604 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b97r9" Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.295025 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-b97r9" event={"ID":"ab76845a-bb55-4956-9bc9-4066fe9d6d0f","Type":"ContainerDied","Data":"1e63d8d1ccdb5dae4b0eb080b09ce737332e1747428f1a98b03b4f0556934164"} Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.295179 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e63d8d1ccdb5dae4b0eb080b09ce737332e1747428f1a98b03b4f0556934164" Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.295249 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.310211 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.312560 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" event={"ID":"e93b7e52-0a4c-4372-a747-43fc785c0990","Type":"ContainerStarted","Data":"7ece0b5f58e6c98123171222e8d867e616adce174afc06b66271d954c4e6a6ce"} Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 
13:29:15.312827 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.317960 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-abab-account-create-update-zlj6m" event={"ID":"c4c77729-b8c8-4973-bd84-43b29765e681","Type":"ContainerDied","Data":"cfb104d31b90e57ba683bd5108528d078d0132b3367b95076491534968c52d1e"} Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.318014 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb104d31b90e57ba683bd5108528d078d0132b3367b95076491534968c52d1e" Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.318129 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-abab-account-create-update-zlj6m" Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.324408 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" event={"ID":"0531362c-01f6-463c-8217-e78b33f55630","Type":"ContainerStarted","Data":"cf6f8672081af8c8ecf267d12292a6b48a19daaa698d77f80325111109309621"} Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.350098 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" podStartSLOduration=4.350045201 podStartE2EDuration="4.350045201s" podCreationTimestamp="2026-03-18 13:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:15.338301646 +0000 UTC m=+1603.797729081" watchObservedRunningTime="2026-03-18 13:29:15.350045201 +0000 UTC m=+1603.809472626" Mar 18 13:29:15 crc kubenswrapper[4912]: I0318 13:29:15.368445 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" podStartSLOduration=3.368420864 
podStartE2EDuration="3.368420864s" podCreationTimestamp="2026-03-18 13:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:15.366421981 +0000 UTC m=+1603.825849406" watchObservedRunningTime="2026-03-18 13:29:15.368420864 +0000 UTC m=+1603.827848289" Mar 18 13:29:16 crc kubenswrapper[4912]: I0318 13:29:16.348171 4912 generic.go:334] "Generic (PLEG): container finished" podID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerID="6359120c35c4dcf8625761ca0cdd19ab1b62024f8cfa18373d0dffb53ab18cf5" exitCode=0 Mar 18 13:29:16 crc kubenswrapper[4912]: I0318 13:29:16.348337 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerDied","Data":"6359120c35c4dcf8625761ca0cdd19ab1b62024f8cfa18373d0dffb53ab18cf5"} Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.049982 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.184891 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-sg-core-conf-yaml\") pod \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.184967 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7ww\" (UniqueName: \"kubernetes.io/projected/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-kube-api-access-jt7ww\") pod \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.185148 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-config-data\") pod \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.185312 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-log-httpd\") pod \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.185338 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-combined-ca-bundle\") pod \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.185487 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-scripts\") pod \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.185518 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-run-httpd\") pod \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\" (UID: \"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0\") " Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.185739 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" (UID: "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.186042 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" (UID: "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.186515 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.186540 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.192018 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-kube-api-access-jt7ww" (OuterVolumeSpecName: "kube-api-access-jt7ww") pod "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" (UID: "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0"). InnerVolumeSpecName "kube-api-access-jt7ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.204637 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-scripts" (OuterVolumeSpecName: "scripts") pod "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" (UID: "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.234783 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" (UID: "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.283486 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" (UID: "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.289392 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.289424 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.289434 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.289444 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7ww\" (UniqueName: \"kubernetes.io/projected/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-kube-api-access-jt7ww\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.308624 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-config-data" (OuterVolumeSpecName: "config-data") pod "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" (UID: "b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.371530 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0","Type":"ContainerDied","Data":"bcdd2e2c2aeb4ceef905e5b2b545bab0b861aee5ca2773ec7ef037e38ec60dc0"} Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.371593 4912 scope.go:117] "RemoveContainer" containerID="9281d9f50dcb02146d941a05a398731ea96b15309e9b3483b58b7d1a9023d2c7" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.371606 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.392610 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.456038 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.481548 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.495510 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:17 crc kubenswrapper[4912]: E0318 13:29:17.496183 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="ceilometer-notification-agent" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496206 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="ceilometer-notification-agent" Mar 18 13:29:17 crc kubenswrapper[4912]: E0318 13:29:17.496235 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" 
containerName="ceilometer-central-agent" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496246 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="ceilometer-central-agent" Mar 18 13:29:17 crc kubenswrapper[4912]: E0318 13:29:17.496267 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="proxy-httpd" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496273 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="proxy-httpd" Mar 18 13:29:17 crc kubenswrapper[4912]: E0318 13:29:17.496289 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="sg-core" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496297 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="sg-core" Mar 18 13:29:17 crc kubenswrapper[4912]: E0318 13:29:17.496321 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c77729-b8c8-4973-bd84-43b29765e681" containerName="mariadb-account-create-update" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496328 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c77729-b8c8-4973-bd84-43b29765e681" containerName="mariadb-account-create-update" Mar 18 13:29:17 crc kubenswrapper[4912]: E0318 13:29:17.496365 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab76845a-bb55-4956-9bc9-4066fe9d6d0f" containerName="mariadb-database-create" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496375 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab76845a-bb55-4956-9bc9-4066fe9d6d0f" containerName="mariadb-database-create" Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496615 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="ceilometer-central-agent"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496630 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="sg-core"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496644 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="proxy-httpd"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496668 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" containerName="ceilometer-notification-agent"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496676 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab76845a-bb55-4956-9bc9-4066fe9d6d0f" containerName="mariadb-database-create"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.496688 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c77729-b8c8-4973-bd84-43b29765e681" containerName="mariadb-account-create-update"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.499797 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.503319 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.503661 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.512479 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.572960 4912 scope.go:117] "RemoveContainer" containerID="9d5b236431f5a9e92c9d5b6bf23286a9f56fc4f0b2466ef50823b235e1ed6d61"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.598733 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-scripts\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.598804 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.598905 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.598942 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-config-data\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.598988 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-run-httpd\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.599040 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-log-httpd\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.599793 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpg55\" (UniqueName: \"kubernetes.io/projected/481be8f3-c3b2-4c54-9488-d1f710e706f3-kube-api-access-fpg55\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.635514 4912 scope.go:117] "RemoveContainer" containerID="19cb2ba536a1cf529f30c4ba31ab481ee8e35343206c731afe825547a84a4ecc"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.678834 4912 scope.go:117] "RemoveContainer" containerID="6359120c35c4dcf8625761ca0cdd19ab1b62024f8cfa18373d0dffb53ab18cf5"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.704223 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.704291 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-config-data\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.704345 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-run-httpd\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.704402 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-log-httpd\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.704511 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpg55\" (UniqueName: \"kubernetes.io/projected/481be8f3-c3b2-4c54-9488-d1f710e706f3-kube-api-access-fpg55\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.704599 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-scripts\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.704630 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.705697 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-run-httpd\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.705892 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-log-httpd\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.709400 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.710133 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-scripts\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.710467 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.710791 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-config-data\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.727027 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpg55\" (UniqueName: \"kubernetes.io/projected/481be8f3-c3b2-4c54-9488-d1f710e706f3-kube-api-access-fpg55\") pod \"ceilometer-0\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " pod="openstack/ceilometer-0"
Mar 18 13:29:17 crc kubenswrapper[4912]: I0318 13:29:17.830366 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:29:18 crc kubenswrapper[4912]: I0318 13:29:18.245962 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0" path="/var/lib/kubelet/pods/b7321ee5-911d-4e3b-98ae-6f8ad2d6c0b0/volumes"
Mar 18 13:29:18 crc kubenswrapper[4912]: I0318 13:29:18.389747 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:29:18 crc kubenswrapper[4912]: I0318 13:29:18.401391 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6f8d91c-a434-4964-9581-bda0cf40119e","Type":"ContainerStarted","Data":"029f8769a540eb19df9572402489f224a63114556326f1fd231fcc4338519bdf"}
Mar 18 13:29:18 crc kubenswrapper[4912]: W0318 13:29:18.407853 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481be8f3_c3b2_4c54_9488_d1f710e706f3.slice/crio-cc2d81ff98d1b26ed673b7e44770b4836cd9c1bc0c3170330f9b1b59dc4b5fda WatchSource:0}: Error finding container cc2d81ff98d1b26ed673b7e44770b4836cd9c1bc0c3170330f9b1b59dc4b5fda: Status 404 returned error can't find the container with id cc2d81ff98d1b26ed673b7e44770b4836cd9c1bc0c3170330f9b1b59dc4b5fda
Mar 18 13:29:18 crc kubenswrapper[4912]: I0318 13:29:18.408157 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d547696f-3142-42a2-8f44-3aee1b48849c","Type":"ContainerStarted","Data":"01294588a82de26436ee31d92bb7321279d3293c00d6180614bd09cb2b311070"}
Mar 18 13:29:18 crc kubenswrapper[4912]: I0318 13:29:18.414668 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071","Type":"ContainerStarted","Data":"15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d"}
Mar 18 13:29:18 crc kubenswrapper[4912]: I0318 13:29:18.443759 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.592046307 podStartE2EDuration="8.443711107s" podCreationTimestamp="2026-03-18 13:29:10 +0000 UTC" firstStartedPulling="2026-03-18 13:29:12.797739507 +0000 UTC m=+1601.257166932" lastFinishedPulling="2026-03-18 13:29:17.649404307 +0000 UTC m=+1606.108831732" observedRunningTime="2026-03-18 13:29:18.433992446 +0000 UTC m=+1606.893419891" watchObservedRunningTime="2026-03-18 13:29:18.443711107 +0000 UTC m=+1606.903138532"
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.431094 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6f8d91c-a434-4964-9581-bda0cf40119e","Type":"ContainerStarted","Data":"b53b193e258af644c705fdf6bc4884ae21842db04dbff6c962faf3a483523094"}
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.431277 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-metadata" containerID="cri-o://b53b193e258af644c705fdf6bc4884ae21842db04dbff6c962faf3a483523094" gracePeriod=30
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.431202 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-log" containerID="cri-o://029f8769a540eb19df9572402489f224a63114556326f1fd231fcc4338519bdf" gracePeriod=30
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.433852 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerStarted","Data":"cc2d81ff98d1b26ed673b7e44770b4836cd9c1bc0c3170330f9b1b59dc4b5fda"}
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.451508 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d547696f-3142-42a2-8f44-3aee1b48849c","Type":"ContainerStarted","Data":"b2abd7423f9c0a8d672516a42218b1d0677e070950216e388718ea29420aa5fd"}
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.455254 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="37e6287a-9a5f-44a2-b9ca-a855075fd554" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80" gracePeriod=30
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.455367 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"37e6287a-9a5f-44a2-b9ca-a855075fd554","Type":"ContainerStarted","Data":"a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80"}
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.482965 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.635574254 podStartE2EDuration="9.482939029s" podCreationTimestamp="2026-03-18 13:29:10 +0000 UTC" firstStartedPulling="2026-03-18 13:29:12.797527151 +0000 UTC m=+1601.256954576" lastFinishedPulling="2026-03-18 13:29:17.644891936 +0000 UTC m=+1606.104319351" observedRunningTime="2026-03-18 13:29:19.479206309 +0000 UTC m=+1607.938633744" watchObservedRunningTime="2026-03-18 13:29:19.482939029 +0000 UTC m=+1607.942366454"
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.531888 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.239205025 podStartE2EDuration="8.531858282s" podCreationTimestamp="2026-03-18 13:29:11 +0000 UTC" firstStartedPulling="2026-03-18 13:29:13.362098634 +0000 UTC m=+1601.821526059" lastFinishedPulling="2026-03-18 13:29:17.654751891 +0000 UTC m=+1606.114179316" observedRunningTime="2026-03-18 13:29:19.519348036 +0000 UTC m=+1607.978775471" watchObservedRunningTime="2026-03-18 13:29:19.531858282 +0000 UTC m=+1607.991285707"
Mar 18 13:29:19 crc kubenswrapper[4912]: I0318 13:29:19.576151 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.527603457 podStartE2EDuration="9.576126781s" podCreationTimestamp="2026-03-18 13:29:10 +0000 UTC" firstStartedPulling="2026-03-18 13:29:12.596332141 +0000 UTC m=+1601.055759566" lastFinishedPulling="2026-03-18 13:29:17.644855465 +0000 UTC m=+1606.104282890" observedRunningTime="2026-03-18 13:29:19.55114058 +0000 UTC m=+1608.010568015" watchObservedRunningTime="2026-03-18 13:29:19.576126781 +0000 UTC m=+1608.035554206"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.302305 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-nk4hg"]
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.305113 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-nk4hg"]
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.305231 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.308854 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d7sqs"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.309305 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.309748 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.309910 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.430591 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-scripts\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.431122 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-combined-ca-bundle\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.431173 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-config-data\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.431209 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8s9\" (UniqueName: \"kubernetes.io/projected/2a703388-f5f1-4975-9c2c-5ac152798930-kube-api-access-4v8s9\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.477275 4912 generic.go:334] "Generic (PLEG): container finished" podID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerID="b53b193e258af644c705fdf6bc4884ae21842db04dbff6c962faf3a483523094" exitCode=0
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.477315 4912 generic.go:334] "Generic (PLEG): container finished" podID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerID="029f8769a540eb19df9572402489f224a63114556326f1fd231fcc4338519bdf" exitCode=143
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.477356 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6f8d91c-a434-4964-9581-bda0cf40119e","Type":"ContainerDied","Data":"b53b193e258af644c705fdf6bc4884ae21842db04dbff6c962faf3a483523094"}
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.477389 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6f8d91c-a434-4964-9581-bda0cf40119e","Type":"ContainerDied","Data":"029f8769a540eb19df9572402489f224a63114556326f1fd231fcc4338519bdf"}
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.479876 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerStarted","Data":"9254227e5b22f2b583f7fa29aeb0a107f3fdfde7a15bd360d4dade3513bdaf3d"}
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.546878 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-combined-ca-bundle\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.547064 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-config-data\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.547174 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8s9\" (UniqueName: \"kubernetes.io/projected/2a703388-f5f1-4975-9c2c-5ac152798930-kube-api-access-4v8s9\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.547283 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-scripts\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.559297 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-config-data\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.559756 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-scripts\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.580971 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-combined-ca-bundle\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.631142 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8s9\" (UniqueName: \"kubernetes.io/projected/2a703388-f5f1-4975-9c2c-5ac152798930-kube-api-access-4v8s9\") pod \"aodh-db-sync-nk4hg\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.698754 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-nk4hg"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.781464 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.860529 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-config-data\") pod \"a6f8d91c-a434-4964-9581-bda0cf40119e\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") "
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.860652 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f8d91c-a434-4964-9581-bda0cf40119e-logs\") pod \"a6f8d91c-a434-4964-9581-bda0cf40119e\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") "
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.860680 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-combined-ca-bundle\") pod \"a6f8d91c-a434-4964-9581-bda0cf40119e\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") "
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.860877 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w7w9\" (UniqueName: \"kubernetes.io/projected/a6f8d91c-a434-4964-9581-bda0cf40119e-kube-api-access-8w7w9\") pod \"a6f8d91c-a434-4964-9581-bda0cf40119e\" (UID: \"a6f8d91c-a434-4964-9581-bda0cf40119e\") "
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.868067 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f8d91c-a434-4964-9581-bda0cf40119e-logs" (OuterVolumeSpecName: "logs") pod "a6f8d91c-a434-4964-9581-bda0cf40119e" (UID: "a6f8d91c-a434-4964-9581-bda0cf40119e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.872442 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f8d91c-a434-4964-9581-bda0cf40119e-kube-api-access-8w7w9" (OuterVolumeSpecName: "kube-api-access-8w7w9") pod "a6f8d91c-a434-4964-9581-bda0cf40119e" (UID: "a6f8d91c-a434-4964-9581-bda0cf40119e"). InnerVolumeSpecName "kube-api-access-8w7w9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.933358 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6f8d91c-a434-4964-9581-bda0cf40119e" (UID: "a6f8d91c-a434-4964-9581-bda0cf40119e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.936288 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-config-data" (OuterVolumeSpecName: "config-data") pod "a6f8d91c-a434-4964-9581-bda0cf40119e" (UID: "a6f8d91c-a434-4964-9581-bda0cf40119e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.966433 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.966703 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f8d91c-a434-4964-9581-bda0cf40119e-logs\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.966788 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f8d91c-a434-4964-9581-bda0cf40119e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:20 crc kubenswrapper[4912]: I0318 13:29:20.966905 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w7w9\" (UniqueName: \"kubernetes.io/projected/a6f8d91c-a434-4964-9581-bda0cf40119e-kube-api-access-8w7w9\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.200340 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.200680 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.496534 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6f8d91c-a434-4964-9581-bda0cf40119e","Type":"ContainerDied","Data":"9d30497b3dbfde15dc5ed0813328679138aaee9a05a90b9b7af52573d937a6b4"}
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.497117 4912 scope.go:117] "RemoveContainer" containerID="b53b193e258af644c705fdf6bc4884ae21842db04dbff6c962faf3a483523094"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.497208 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.498082 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.498115 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.508367 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerStarted","Data":"e891f274822d69f2a65aaf57003f10c28a7a4e8758e170a32ed2956d883fc43f"}
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.529470 4912 scope.go:117] "RemoveContainer" containerID="029f8769a540eb19df9572402489f224a63114556326f1fd231fcc4338519bdf"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.548407 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-nk4hg"]
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.575255 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.591508 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.601250 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.630724 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:21 crc kubenswrapper[4912]: E0318 13:29:21.631734 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-metadata"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.631766 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-metadata"
Mar 18 13:29:21 crc kubenswrapper[4912]: E0318 13:29:21.631826 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-log"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.631837 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-log"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.632339 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-log"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.632364 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" containerName="nova-metadata-metadata"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.654274 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.658019 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.658293 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.665134 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.726794 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.726854 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-config-data\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.727535 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.727781 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfc6h\" (UniqueName: \"kubernetes.io/projected/2fceb44d-0e65-4c40-919d-4e7188c23c54-kube-api-access-wfc6h\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.727959 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fceb44d-0e65-4c40-919d-4e7188c23c54-logs\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.831365 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfc6h\" (UniqueName: \"kubernetes.io/projected/2fceb44d-0e65-4c40-919d-4e7188c23c54-kube-api-access-wfc6h\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.831462 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fceb44d-0e65-4c40-919d-4e7188c23c54-logs\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.831566 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.831587 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-config-data\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.831687 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.833220 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fceb44d-0e65-4c40-919d-4e7188c23c54-logs\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.842247 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.859278 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-config-data\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.862680 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.862945 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfc6h\" (UniqueName: \"kubernetes.io/projected/2fceb44d-0e65-4c40-919d-4e7188c23c54-kube-api-access-wfc6h\") pod \"nova-metadata-0\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.925213 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b"
Mar 18 13:29:21 crc kubenswrapper[4912]: I0318 13:29:21.995990 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.043323 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bcs29"]
Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.043689 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerName="dnsmasq-dns" containerID="cri-o://b8aec3cb31cff2d6421d2242c7bd7db74e0a2f4848f3998afd72e1bdd93798cd" gracePeriod=10
Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.286151 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f8d91c-a434-4964-9581-bda0cf40119e" path="/var/lib/kubelet/pods/a6f8d91c-a434-4964-9581-bda0cf40119e/volumes"
Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.287766 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.288451 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.288805 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0"
podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.573778 4912 generic.go:334] "Generic (PLEG): container finished" podID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerID="b8aec3cb31cff2d6421d2242c7bd7db74e0a2f4848f3998afd72e1bdd93798cd" exitCode=0 Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.573858 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" event={"ID":"6eca8a13-092c-4ab7-8c93-a91e352f2ad0","Type":"ContainerDied","Data":"b8aec3cb31cff2d6421d2242c7bd7db74e0a2f4848f3998afd72e1bdd93798cd"} Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.623974 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerStarted","Data":"0f42fbac48d34c4d44ae825497153e86acb258eca1cbf6882960d308f795eba2"} Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.635964 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nk4hg" event={"ID":"2a703388-f5f1-4975-9c2c-5ac152798930","Type":"ContainerStarted","Data":"c191008c0a9b53ac9b4e9da84e27c308ddd2cebf7ab621559080ba449fce5ff0"} Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.686928 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:29:22 crc kubenswrapper[4912]: I0318 13:29:22.879180 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.243685 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.324340 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-sb\") pod \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.324428 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-nb\") pod \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.324630 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtvfz\" (UniqueName: \"kubernetes.io/projected/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-kube-api-access-mtvfz\") pod \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.324707 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-svc\") pod \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.324775 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-config\") pod \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.324818 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-swift-storage-0\") pod \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\" (UID: \"6eca8a13-092c-4ab7-8c93-a91e352f2ad0\") " Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.333722 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-kube-api-access-mtvfz" (OuterVolumeSpecName: "kube-api-access-mtvfz") pod "6eca8a13-092c-4ab7-8c93-a91e352f2ad0" (UID: "6eca8a13-092c-4ab7-8c93-a91e352f2ad0"). InnerVolumeSpecName "kube-api-access-mtvfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.432624 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtvfz\" (UniqueName: \"kubernetes.io/projected/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-kube-api-access-mtvfz\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.460511 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6eca8a13-092c-4ab7-8c93-a91e352f2ad0" (UID: "6eca8a13-092c-4ab7-8c93-a91e352f2ad0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.475190 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6eca8a13-092c-4ab7-8c93-a91e352f2ad0" (UID: "6eca8a13-092c-4ab7-8c93-a91e352f2ad0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.535158 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-config" (OuterVolumeSpecName: "config") pod "6eca8a13-092c-4ab7-8c93-a91e352f2ad0" (UID: "6eca8a13-092c-4ab7-8c93-a91e352f2ad0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.536622 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.536648 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.536659 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.551845 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6eca8a13-092c-4ab7-8c93-a91e352f2ad0" (UID: "6eca8a13-092c-4ab7-8c93-a91e352f2ad0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.571901 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6eca8a13-092c-4ab7-8c93-a91e352f2ad0" (UID: "6eca8a13-092c-4ab7-8c93-a91e352f2ad0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.640254 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.640292 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6eca8a13-092c-4ab7-8c93-a91e352f2ad0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.679191 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fceb44d-0e65-4c40-919d-4e7188c23c54","Type":"ContainerStarted","Data":"c923e19527ae885588b71db706db252aa6c8db209f1faa1d8d94e9642a490057"} Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.679711 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fceb44d-0e65-4c40-919d-4e7188c23c54","Type":"ContainerStarted","Data":"d20110b256b10c97e8ba7532bfeb68fca8eb2b6acb8749d900a4d9fde47b40a9"} Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.691555 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.692498 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" event={"ID":"6eca8a13-092c-4ab7-8c93-a91e352f2ad0","Type":"ContainerDied","Data":"1680f82c93e174f4a687deddc1ffd499d1f91f2e80d8f6ed8954cff10563c443"} Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.692554 4912 scope.go:117] "RemoveContainer" containerID="b8aec3cb31cff2d6421d2242c7bd7db74e0a2f4848f3998afd72e1bdd93798cd" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.772862 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bcs29"] Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.785973 4912 scope.go:117] "RemoveContainer" containerID="b9057db229a35c2f95d9a2edbb01db085519827120f7d1cd65dc62c82fb312b6" Mar 18 13:29:23 crc kubenswrapper[4912]: I0318 13:29:23.790587 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bcs29"] Mar 18 13:29:24 crc kubenswrapper[4912]: I0318 13:29:24.251857 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" path="/var/lib/kubelet/pods/6eca8a13-092c-4ab7-8c93-a91e352f2ad0/volumes" Mar 18 13:29:24 crc kubenswrapper[4912]: I0318 13:29:24.714593 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fceb44d-0e65-4c40-919d-4e7188c23c54","Type":"ContainerStarted","Data":"3c95e0da09b94270afa42b8cde2bd69cd5e1e7b82c1ef3998b408722f1bb0265"} Mar 18 13:29:25 crc kubenswrapper[4912]: I0318 13:29:25.734133 4912 generic.go:334] "Generic (PLEG): container finished" podID="0531362c-01f6-463c-8217-e78b33f55630" containerID="cf6f8672081af8c8ecf267d12292a6b48a19daaa698d77f80325111109309621" exitCode=0 Mar 18 13:29:25 crc kubenswrapper[4912]: I0318 13:29:25.734219 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-tqk4j" event={"ID":"0531362c-01f6-463c-8217-e78b33f55630","Type":"ContainerDied","Data":"cf6f8672081af8c8ecf267d12292a6b48a19daaa698d77f80325111109309621"} Mar 18 13:29:25 crc kubenswrapper[4912]: I0318 13:29:25.736542 4912 generic.go:334] "Generic (PLEG): container finished" podID="7e3e3ac4-e8a9-473c-96cb-479132a1882d" containerID="a17a15307628daee1060fc12806a4ac94691a1855194e5d949370fc0ecd59cd1" exitCode=0 Mar 18 13:29:25 crc kubenswrapper[4912]: I0318 13:29:25.736629 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lkdsg" event={"ID":"7e3e3ac4-e8a9-473c-96cb-479132a1882d","Type":"ContainerDied","Data":"a17a15307628daee1060fc12806a4ac94691a1855194e5d949370fc0ecd59cd1"} Mar 18 13:29:25 crc kubenswrapper[4912]: I0318 13:29:25.739239 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerStarted","Data":"57e0739129e4a6ae5899001611fc239450a00969b77839224c062e192482ad01"} Mar 18 13:29:25 crc kubenswrapper[4912]: I0318 13:29:25.763137 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.763107182 podStartE2EDuration="4.763107182s" podCreationTimestamp="2026-03-18 13:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:24.747733528 +0000 UTC m=+1613.207160963" watchObservedRunningTime="2026-03-18 13:29:25.763107182 +0000 UTC m=+1614.222534607" Mar 18 13:29:25 crc kubenswrapper[4912]: I0318 13:29:25.808711 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.919755855 podStartE2EDuration="8.808684405s" podCreationTimestamp="2026-03-18 13:29:17 +0000 UTC" firstStartedPulling="2026-03-18 13:29:18.413989379 +0000 UTC m=+1606.873416804" 
lastFinishedPulling="2026-03-18 13:29:24.302917929 +0000 UTC m=+1612.762345354" observedRunningTime="2026-03-18 13:29:25.803758393 +0000 UTC m=+1614.263185828" watchObservedRunningTime="2026-03-18 13:29:25.808684405 +0000 UTC m=+1614.268111830" Mar 18 13:29:26 crc kubenswrapper[4912]: I0318 13:29:26.764323 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:29:27 crc kubenswrapper[4912]: I0318 13:29:27.643974 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-688b9f5b49-bcs29" podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.220:5353: i/o timeout" Mar 18 13:29:28 crc kubenswrapper[4912]: I0318 13:29:28.792263 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lkdsg" event={"ID":"7e3e3ac4-e8a9-473c-96cb-479132a1882d","Type":"ContainerDied","Data":"50d70d7b2299d03ca2093efe5d684b0402017cb72f675664bf60c42241e1afec"} Mar 18 13:29:28 crc kubenswrapper[4912]: I0318 13:29:28.792316 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d70d7b2299d03ca2093efe5d684b0402017cb72f675664bf60c42241e1afec" Mar 18 13:29:28 crc kubenswrapper[4912]: I0318 13:29:28.796076 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" event={"ID":"0531362c-01f6-463c-8217-e78b33f55630","Type":"ContainerDied","Data":"47046dd385c6a5167adc987dbdb4d42962f580543d12f4100876213aa99d4181"} Mar 18 13:29:28 crc kubenswrapper[4912]: I0318 13:29:28.796212 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47046dd385c6a5167adc987dbdb4d42962f580543d12f4100876213aa99d4181" Mar 18 13:29:28 crc kubenswrapper[4912]: I0318 13:29:28.984104 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:28 crc kubenswrapper[4912]: I0318 13:29:28.995938 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.105260 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-combined-ca-bundle\") pod \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.105665 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkqkz\" (UniqueName: \"kubernetes.io/projected/7e3e3ac4-e8a9-473c-96cb-479132a1882d-kube-api-access-hkqkz\") pod \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.105795 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-scripts\") pod \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.105833 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-config-data\") pod \"0531362c-01f6-463c-8217-e78b33f55630\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.105886 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-combined-ca-bundle\") pod \"0531362c-01f6-463c-8217-e78b33f55630\" (UID: 
\"0531362c-01f6-463c-8217-e78b33f55630\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.105990 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-config-data\") pod \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\" (UID: \"7e3e3ac4-e8a9-473c-96cb-479132a1882d\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.106103 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q647\" (UniqueName: \"kubernetes.io/projected/0531362c-01f6-463c-8217-e78b33f55630-kube-api-access-4q647\") pod \"0531362c-01f6-463c-8217-e78b33f55630\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.106216 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-scripts\") pod \"0531362c-01f6-463c-8217-e78b33f55630\" (UID: \"0531362c-01f6-463c-8217-e78b33f55630\") " Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.120803 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-scripts" (OuterVolumeSpecName: "scripts") pod "0531362c-01f6-463c-8217-e78b33f55630" (UID: "0531362c-01f6-463c-8217-e78b33f55630"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.140751 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-scripts" (OuterVolumeSpecName: "scripts") pod "7e3e3ac4-e8a9-473c-96cb-479132a1882d" (UID: "7e3e3ac4-e8a9-473c-96cb-479132a1882d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.142539 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0531362c-01f6-463c-8217-e78b33f55630-kube-api-access-4q647" (OuterVolumeSpecName: "kube-api-access-4q647") pod "0531362c-01f6-463c-8217-e78b33f55630" (UID: "0531362c-01f6-463c-8217-e78b33f55630"). InnerVolumeSpecName "kube-api-access-4q647". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.145777 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3e3ac4-e8a9-473c-96cb-479132a1882d-kube-api-access-hkqkz" (OuterVolumeSpecName: "kube-api-access-hkqkz") pod "7e3e3ac4-e8a9-473c-96cb-479132a1882d" (UID: "7e3e3ac4-e8a9-473c-96cb-479132a1882d"). InnerVolumeSpecName "kube-api-access-hkqkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.150378 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-config-data" (OuterVolumeSpecName: "config-data") pod "7e3e3ac4-e8a9-473c-96cb-479132a1882d" (UID: "7e3e3ac4-e8a9-473c-96cb-479132a1882d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.152008 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e3e3ac4-e8a9-473c-96cb-479132a1882d" (UID: "7e3e3ac4-e8a9-473c-96cb-479132a1882d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.154970 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0531362c-01f6-463c-8217-e78b33f55630" (UID: "0531362c-01f6-463c-8217-e78b33f55630"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.188235 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-config-data" (OuterVolumeSpecName: "config-data") pod "0531362c-01f6-463c-8217-e78b33f55630" (UID: "0531362c-01f6-463c-8217-e78b33f55630"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.199994 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.200084 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209700 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209746 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209764 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209785 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209798 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q647\" (UniqueName: \"kubernetes.io/projected/0531362c-01f6-463c-8217-e78b33f55630-kube-api-access-4q647\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209813 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531362c-01f6-463c-8217-e78b33f55630-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209823 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3e3ac4-e8a9-473c-96cb-479132a1882d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.209832 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkqkz\" (UniqueName: \"kubernetes.io/projected/7e3e3ac4-e8a9-473c-96cb-479132a1882d-kube-api-access-hkqkz\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.822738 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lkdsg" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.822739 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nk4hg" event={"ID":"2a703388-f5f1-4975-9c2c-5ac152798930","Type":"ContainerStarted","Data":"d855dc2b8f4b2b41840914014e8856f43312b10cfe76c56df63f499bd087d986"} Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.822847 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tqk4j" Mar 18 13:29:29 crc kubenswrapper[4912]: I0318 13:29:29.853823 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-nk4hg" podStartSLOduration=2.6963722690000003 podStartE2EDuration="9.853789526s" podCreationTimestamp="2026-03-18 13:29:20 +0000 UTC" firstStartedPulling="2026-03-18 13:29:21.614412559 +0000 UTC m=+1610.073839984" lastFinishedPulling="2026-03-18 13:29:28.771829816 +0000 UTC m=+1617.231257241" observedRunningTime="2026-03-18 13:29:29.847796465 +0000 UTC m=+1618.307223900" watchObservedRunningTime="2026-03-18 13:29:29.853789526 +0000 UTC m=+1618.313216951" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.107839 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:29:30 crc kubenswrapper[4912]: E0318 13:29:30.108723 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3e3ac4-e8a9-473c-96cb-479132a1882d" containerName="nova-manage" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.108746 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3e3ac4-e8a9-473c-96cb-479132a1882d" containerName="nova-manage" Mar 18 13:29:30 crc kubenswrapper[4912]: E0318 13:29:30.108766 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0531362c-01f6-463c-8217-e78b33f55630" containerName="nova-cell1-conductor-db-sync" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.108773 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="0531362c-01f6-463c-8217-e78b33f55630" containerName="nova-cell1-conductor-db-sync" Mar 18 13:29:30 crc kubenswrapper[4912]: E0318 13:29:30.108791 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerName="dnsmasq-dns" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.108798 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerName="dnsmasq-dns" Mar 18 13:29:30 crc kubenswrapper[4912]: E0318 13:29:30.108813 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerName="init" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.108820 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerName="init" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.109146 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eca8a13-092c-4ab7-8c93-a91e352f2ad0" containerName="dnsmasq-dns" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.109174 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3e3ac4-e8a9-473c-96cb-479132a1882d" containerName="nova-manage" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.109189 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="0531362c-01f6-463c-8217-e78b33f55630" containerName="nova-cell1-conductor-db-sync" Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.110852 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.120620 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.140590 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.141330 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b0bbd2-0094-4cf8-b5ae-624dde267b12-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.141397 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfl96\" (UniqueName: \"kubernetes.io/projected/78b0bbd2-0094-4cf8-b5ae-624dde267b12-kube-api-access-zfl96\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.141829 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b0bbd2-0094-4cf8-b5ae-624dde267b12-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.247136 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b0bbd2-0094-4cf8-b5ae-624dde267b12-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.247206 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfl96\" (UniqueName: \"kubernetes.io/projected/78b0bbd2-0094-4cf8-b5ae-624dde267b12-kube-api-access-zfl96\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.247368 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b0bbd2-0094-4cf8-b5ae-624dde267b12-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.273569 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b0bbd2-0094-4cf8-b5ae-624dde267b12-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.278715 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfl96\" (UniqueName: \"kubernetes.io/projected/78b0bbd2-0094-4cf8-b5ae-624dde267b12-kube-api-access-zfl96\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.300028 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b0bbd2-0094-4cf8-b5ae-624dde267b12-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78b0bbd2-0094-4cf8-b5ae-624dde267b12\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.306124 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.306791 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-log" containerID="cri-o://01294588a82de26436ee31d92bb7321279d3293c00d6180614bd09cb2b311070" gracePeriod=30
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.307562 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-api" containerID="cri-o://b2abd7423f9c0a8d672516a42218b1d0677e070950216e388718ea29420aa5fd" gracePeriod=30
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.340237 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.342525 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" containerName="nova-scheduler-scheduler" containerID="cri-o://15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d" gracePeriod=30
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.371368 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.371741 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-log" containerID="cri-o://c923e19527ae885588b71db706db252aa6c8db209f1faa1d8d94e9642a490057" gracePeriod=30
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.371916 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-metadata" containerID="cri-o://3c95e0da09b94270afa42b8cde2bd69cd5e1e7b82c1ef3998b408722f1bb0265" gracePeriod=30
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.448573 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.843657 4912 generic.go:334] "Generic (PLEG): container finished" podID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerID="3c95e0da09b94270afa42b8cde2bd69cd5e1e7b82c1ef3998b408722f1bb0265" exitCode=0
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.844075 4912 generic.go:334] "Generic (PLEG): container finished" podID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerID="c923e19527ae885588b71db706db252aa6c8db209f1faa1d8d94e9642a490057" exitCode=143
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.843766 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fceb44d-0e65-4c40-919d-4e7188c23c54","Type":"ContainerDied","Data":"3c95e0da09b94270afa42b8cde2bd69cd5e1e7b82c1ef3998b408722f1bb0265"}
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.844170 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fceb44d-0e65-4c40-919d-4e7188c23c54","Type":"ContainerDied","Data":"c923e19527ae885588b71db706db252aa6c8db209f1faa1d8d94e9642a490057"}
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.848706 4912 generic.go:334] "Generic (PLEG): container finished" podID="d547696f-3142-42a2-8f44-3aee1b48849c" containerID="01294588a82de26436ee31d92bb7321279d3293c00d6180614bd09cb2b311070" exitCode=143
Mar 18 13:29:30 crc kubenswrapper[4912]: I0318 13:29:30.849197 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d547696f-3142-42a2-8f44-3aee1b48849c","Type":"ContainerDied","Data":"01294588a82de26436ee31d92bb7321279d3293c00d6180614bd09cb2b311070"}
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.042746 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.267135 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.394156 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-combined-ca-bundle\") pod \"2fceb44d-0e65-4c40-919d-4e7188c23c54\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") "
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.394239 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-nova-metadata-tls-certs\") pod \"2fceb44d-0e65-4c40-919d-4e7188c23c54\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") "
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.394370 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fceb44d-0e65-4c40-919d-4e7188c23c54-logs\") pod \"2fceb44d-0e65-4c40-919d-4e7188c23c54\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") "
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.394414 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfc6h\" (UniqueName: \"kubernetes.io/projected/2fceb44d-0e65-4c40-919d-4e7188c23c54-kube-api-access-wfc6h\") pod \"2fceb44d-0e65-4c40-919d-4e7188c23c54\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") "
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.394459 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-config-data\") pod \"2fceb44d-0e65-4c40-919d-4e7188c23c54\" (UID: \"2fceb44d-0e65-4c40-919d-4e7188c23c54\") "
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.395260 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fceb44d-0e65-4c40-919d-4e7188c23c54-logs" (OuterVolumeSpecName: "logs") pod "2fceb44d-0e65-4c40-919d-4e7188c23c54" (UID: "2fceb44d-0e65-4c40-919d-4e7188c23c54"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.424432 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fceb44d-0e65-4c40-919d-4e7188c23c54-kube-api-access-wfc6h" (OuterVolumeSpecName: "kube-api-access-wfc6h") pod "2fceb44d-0e65-4c40-919d-4e7188c23c54" (UID: "2fceb44d-0e65-4c40-919d-4e7188c23c54"). InnerVolumeSpecName "kube-api-access-wfc6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.441935 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-config-data" (OuterVolumeSpecName: "config-data") pod "2fceb44d-0e65-4c40-919d-4e7188c23c54" (UID: "2fceb44d-0e65-4c40-919d-4e7188c23c54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.457223 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fceb44d-0e65-4c40-919d-4e7188c23c54" (UID: "2fceb44d-0e65-4c40-919d-4e7188c23c54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:29:31 crc kubenswrapper[4912]: E0318 13:29:31.498556 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 13:29:31 crc kubenswrapper[4912]: E0318 13:29:31.500371 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.500964 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.500992 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fceb44d-0e65-4c40-919d-4e7188c23c54-logs\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.501006 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfc6h\" (UniqueName: \"kubernetes.io/projected/2fceb44d-0e65-4c40-919d-4e7188c23c54-kube-api-access-wfc6h\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.501020 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.502353 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2fceb44d-0e65-4c40-919d-4e7188c23c54" (UID: "2fceb44d-0e65-4c40-919d-4e7188c23c54"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:29:31 crc kubenswrapper[4912]: E0318 13:29:31.504650 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 13:29:31 crc kubenswrapper[4912]: E0318 13:29:31.504740 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" containerName="nova-scheduler-scheduler"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.603964 4912 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb44d-0e65-4c40-919d-4e7188c23c54-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.864303 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78b0bbd2-0094-4cf8-b5ae-624dde267b12","Type":"ContainerStarted","Data":"227cf19912f97455309b5e2e1cc746ea49681e05bff2385bb9d56c9731623901"}
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.864365 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78b0bbd2-0094-4cf8-b5ae-624dde267b12","Type":"ContainerStarted","Data":"54b62cecafaa72c19310512433196d79160130c5fe6a4f4c58f6d08cbf149950"}
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.866842 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.869337 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fceb44d-0e65-4c40-919d-4e7188c23c54","Type":"ContainerDied","Data":"d20110b256b10c97e8ba7532bfeb68fca8eb2b6acb8749d900a4d9fde47b40a9"}
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.869391 4912 scope.go:117] "RemoveContainer" containerID="3c95e0da09b94270afa42b8cde2bd69cd5e1e7b82c1ef3998b408722f1bb0265"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.869527 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.892556 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.89253317 podStartE2EDuration="1.89253317s" podCreationTimestamp="2026-03-18 13:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:31.890773672 +0000 UTC m=+1620.350201097" watchObservedRunningTime="2026-03-18 13:29:31.89253317 +0000 UTC m=+1620.351960595"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.903943 4912 scope.go:117] "RemoveContainer" containerID="c923e19527ae885588b71db706db252aa6c8db209f1faa1d8d94e9642a490057"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.927661 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.952627 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.969127 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:31 crc kubenswrapper[4912]: E0318 13:29:31.969832 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-metadata"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.969853 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-metadata"
Mar 18 13:29:31 crc kubenswrapper[4912]: E0318 13:29:31.969889 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-log"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.969895 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-log"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.970224 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-log"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.970245 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" containerName="nova-metadata-metadata"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.971630 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.975136 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.975413 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 18 13:29:31 crc kubenswrapper[4912]: I0318 13:29:31.980334 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.046246 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ad23f9-f6b9-4ead-8326-29c9d0537d63-logs\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.046352 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxqk8\" (UniqueName: \"kubernetes.io/projected/19ad23f9-f6b9-4ead-8326-29c9d0537d63-kube-api-access-dxqk8\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.046439 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.046463 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.046585 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-config-data\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.149508 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ad23f9-f6b9-4ead-8326-29c9d0537d63-logs\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.149770 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ad23f9-f6b9-4ead-8326-29c9d0537d63-logs\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.150352 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxqk8\" (UniqueName: \"kubernetes.io/projected/19ad23f9-f6b9-4ead-8326-29c9d0537d63-kube-api-access-dxqk8\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.150675 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.150711 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.151760 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-config-data\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.154902 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.154914 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.155752 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-config-data\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.193547 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqk8\" (UniqueName: \"kubernetes.io/projected/19ad23f9-f6b9-4ead-8326-29c9d0537d63-kube-api-access-dxqk8\") pod \"nova-metadata-0\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") " pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.267168 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fceb44d-0e65-4c40-919d-4e7188c23c54" path="/var/lib/kubelet/pods/2fceb44d-0e65-4c40-919d-4e7188c23c54/volumes"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.295545 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.888573 4912 generic.go:334] "Generic (PLEG): container finished" podID="e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" containerID="15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d" exitCode=0
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.889249 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071","Type":"ContainerDied","Data":"15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d"}
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.889295 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071","Type":"ContainerDied","Data":"06a60f47c94c61bb55fc0b58d932a31c1bb7db9fe33cc6a7208686c70f8be1eb"}
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.889316 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a60f47c94c61bb55fc0b58d932a31c1bb7db9fe33cc6a7208686c70f8be1eb"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.895282 4912 generic.go:334] "Generic (PLEG): container finished" podID="2a703388-f5f1-4975-9c2c-5ac152798930" containerID="d855dc2b8f4b2b41840914014e8856f43312b10cfe76c56df63f499bd087d986" exitCode=0
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.896013 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nk4hg" event={"ID":"2a703388-f5f1-4975-9c2c-5ac152798930","Type":"ContainerDied","Data":"d855dc2b8f4b2b41840914014e8856f43312b10cfe76c56df63f499bd087d986"}
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.949564 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 13:29:32 crc kubenswrapper[4912]: I0318 13:29:32.996407 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.093822 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhfwx\" (UniqueName: \"kubernetes.io/projected/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-kube-api-access-lhfwx\") pod \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") "
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.093986 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-combined-ca-bundle\") pod \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") "
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.094550 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-config-data\") pod \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\" (UID: \"e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071\") "
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.101245 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-kube-api-access-lhfwx" (OuterVolumeSpecName: "kube-api-access-lhfwx") pod "e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" (UID: "e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071"). InnerVolumeSpecName "kube-api-access-lhfwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.130646 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" (UID: "e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.134026 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-config-data" (OuterVolumeSpecName: "config-data") pod "e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" (UID: "e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.206051 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.206105 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhfwx\" (UniqueName: \"kubernetes.io/projected/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-kube-api-access-lhfwx\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.206122 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.918333 4912 generic.go:334] "Generic (PLEG): container finished" podID="d547696f-3142-42a2-8f44-3aee1b48849c" containerID="b2abd7423f9c0a8d672516a42218b1d0677e070950216e388718ea29420aa5fd" exitCode=0
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.918459 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d547696f-3142-42a2-8f44-3aee1b48849c","Type":"ContainerDied","Data":"b2abd7423f9c0a8d672516a42218b1d0677e070950216e388718ea29420aa5fd"}
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.923051 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.927075 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19ad23f9-f6b9-4ead-8326-29c9d0537d63","Type":"ContainerStarted","Data":"f2a076bd6f7cb3166561c238610017d2f2d91d31f5f7623809bf4d0b1410acb7"}
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.927116 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19ad23f9-f6b9-4ead-8326-29c9d0537d63","Type":"ContainerStarted","Data":"fdf584e6131f430e893a76b345416cc3fc060f6768ab2ecb498d91aed7ce56be"}
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.927132 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19ad23f9-f6b9-4ead-8326-29c9d0537d63","Type":"ContainerStarted","Data":"827e8780f38aad5082c39610b04e91871d68072f9d522ca7f4b4dbc11d6292d7"}
Mar 18 13:29:33 crc kubenswrapper[4912]: I0318 13:29:33.963888 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.963862084 podStartE2EDuration="2.963862084s" podCreationTimestamp="2026-03-18 13:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:33.962569449 +0000 UTC m=+1622.421996884" watchObservedRunningTime="2026-03-18 13:29:33.963862084 +0000 UTC m=+1622.423289509"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.024778 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.058591 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.093089 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:29:34 crc kubenswrapper[4912]: E0318 13:29:34.093939 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" containerName="nova-scheduler-scheduler"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.093958 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" containerName="nova-scheduler-scheduler"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.094384 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" containerName="nova-scheduler-scheduler"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.095741 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.099683 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.112352 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.134065 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.171342 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.171787 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-config-data\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.172074 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvdcd\" (UniqueName: \"kubernetes.io/projected/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-kube-api-access-gvdcd\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.274945 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d547696f-3142-42a2-8f44-3aee1b48849c-logs\") pod \"d547696f-3142-42a2-8f44-3aee1b48849c\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") "
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.275838 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-config-data\") pod \"d547696f-3142-42a2-8f44-3aee1b48849c\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") "
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.275907 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-combined-ca-bundle\") pod \"d547696f-3142-42a2-8f44-3aee1b48849c\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") "
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.276022 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgfrl\" (UniqueName: \"kubernetes.io/projected/d547696f-3142-42a2-8f44-3aee1b48849c-kube-api-access-pgfrl\") pod \"d547696f-3142-42a2-8f44-3aee1b48849c\" (UID: \"d547696f-3142-42a2-8f44-3aee1b48849c\") "
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.276155 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d547696f-3142-42a2-8f44-3aee1b48849c-logs" (OuterVolumeSpecName: "logs") pod "d547696f-3142-42a2-8f44-3aee1b48849c" (UID: "d547696f-3142-42a2-8f44-3aee1b48849c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.288718 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-config-data\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.288981 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvdcd\" (UniqueName: \"kubernetes.io/projected/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-kube-api-access-gvdcd\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0"
Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.289217 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.289555 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d547696f-3142-42a2-8f44-3aee1b48849c-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.295354 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d547696f-3142-42a2-8f44-3aee1b48849c-kube-api-access-pgfrl" (OuterVolumeSpecName: "kube-api-access-pgfrl") pod "d547696f-3142-42a2-8f44-3aee1b48849c" (UID: "d547696f-3142-42a2-8f44-3aee1b48849c"). InnerVolumeSpecName "kube-api-access-pgfrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.297279 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.301803 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-config-data\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.306488 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071" path="/var/lib/kubelet/pods/e2cfe2a1-f3d3-4ece-aa38-2c1bd14c3071/volumes" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.317187 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gvdcd\" (UniqueName: \"kubernetes.io/projected/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-kube-api-access-gvdcd\") pod \"nova-scheduler-0\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " pod="openstack/nova-scheduler-0" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.327129 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d547696f-3142-42a2-8f44-3aee1b48849c" (UID: "d547696f-3142-42a2-8f44-3aee1b48849c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.391414 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-config-data" (OuterVolumeSpecName: "config-data") pod "d547696f-3142-42a2-8f44-3aee1b48849c" (UID: "d547696f-3142-42a2-8f44-3aee1b48849c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.394111 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.394142 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgfrl\" (UniqueName: \"kubernetes.io/projected/d547696f-3142-42a2-8f44-3aee1b48849c-kube-api-access-pgfrl\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.394156 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d547696f-3142-42a2-8f44-3aee1b48849c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.457706 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-nk4hg" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.461665 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.604324 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-scripts\") pod \"2a703388-f5f1-4975-9c2c-5ac152798930\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.605893 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-config-data\") pod \"2a703388-f5f1-4975-9c2c-5ac152798930\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.606077 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-combined-ca-bundle\") pod \"2a703388-f5f1-4975-9c2c-5ac152798930\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.606177 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v8s9\" (UniqueName: \"kubernetes.io/projected/2a703388-f5f1-4975-9c2c-5ac152798930-kube-api-access-4v8s9\") pod \"2a703388-f5f1-4975-9c2c-5ac152798930\" (UID: \"2a703388-f5f1-4975-9c2c-5ac152798930\") " Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.609518 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-scripts" (OuterVolumeSpecName: "scripts") pod "2a703388-f5f1-4975-9c2c-5ac152798930" (UID: "2a703388-f5f1-4975-9c2c-5ac152798930"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.612791 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a703388-f5f1-4975-9c2c-5ac152798930-kube-api-access-4v8s9" (OuterVolumeSpecName: "kube-api-access-4v8s9") pod "2a703388-f5f1-4975-9c2c-5ac152798930" (UID: "2a703388-f5f1-4975-9c2c-5ac152798930"). InnerVolumeSpecName "kube-api-access-4v8s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.652704 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-config-data" (OuterVolumeSpecName: "config-data") pod "2a703388-f5f1-4975-9c2c-5ac152798930" (UID: "2a703388-f5f1-4975-9c2c-5ac152798930"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.653458 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a703388-f5f1-4975-9c2c-5ac152798930" (UID: "2a703388-f5f1-4975-9c2c-5ac152798930"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.710407 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.710454 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.710468 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a703388-f5f1-4975-9c2c-5ac152798930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.710484 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v8s9\" (UniqueName: \"kubernetes.io/projected/2a703388-f5f1-4975-9c2c-5ac152798930-kube-api-access-4v8s9\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.950084 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d547696f-3142-42a2-8f44-3aee1b48849c","Type":"ContainerDied","Data":"21304a692e71b2b72f10e71d0aedeeab67bd5ce453457b327a2cf7b0fa311ed2"} Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.950169 4912 scope.go:117] "RemoveContainer" containerID="b2abd7423f9c0a8d672516a42218b1d0677e070950216e388718ea29420aa5fd" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.950513 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.967531 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nk4hg" event={"ID":"2a703388-f5f1-4975-9c2c-5ac152798930","Type":"ContainerDied","Data":"c191008c0a9b53ac9b4e9da84e27c308ddd2cebf7ab621559080ba449fce5ff0"} Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.967585 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-nk4hg" Mar 18 13:29:34 crc kubenswrapper[4912]: I0318 13:29:34.967600 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c191008c0a9b53ac9b4e9da84e27c308ddd2cebf7ab621559080ba449fce5ff0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.001477 4912 scope.go:117] "RemoveContainer" containerID="01294588a82de26436ee31d92bb7321279d3293c00d6180614bd09cb2b311070" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.011920 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.043308 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.078228 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.089121 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:35 crc kubenswrapper[4912]: E0318 13:29:35.089973 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-log" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.089992 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-log" Mar 18 13:29:35 crc kubenswrapper[4912]: E0318 13:29:35.090024 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-api" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.090031 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-api" Mar 18 13:29:35 crc kubenswrapper[4912]: E0318 13:29:35.090250 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a703388-f5f1-4975-9c2c-5ac152798930" containerName="aodh-db-sync" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.090262 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a703388-f5f1-4975-9c2c-5ac152798930" containerName="aodh-db-sync" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.097531 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-api" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.097615 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" containerName="nova-api-log" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.097644 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a703388-f5f1-4975-9c2c-5ac152798930" containerName="aodh-db-sync" Mar 18 13:29:35 crc kubenswrapper[4912]: W0318 13:29:35.097853 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91ce66a9_93da_4cc6_9ed8_2b9f96b330ed.slice/crio-202daf2a7c70b968bb622456c26f6a47abb1f665f15a4364bd3b3d67e87fa492 WatchSource:0}: Error finding container 202daf2a7c70b968bb622456c26f6a47abb1f665f15a4364bd3b3d67e87fa492: Status 404 returned error can't find the container with id 202daf2a7c70b968bb622456c26f6a47abb1f665f15a4364bd3b3d67e87fa492 Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.118400 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.130991 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1102f18c-97a0-41a7-b08b-f208fa48ec08-logs\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.131410 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klpj2\" (UniqueName: \"kubernetes.io/projected/1102f18c-97a0-41a7-b08b-f208fa48ec08-kube-api-access-klpj2\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.131492 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-config-data\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.131583 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.138686 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.157098 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.192117 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 13:29:35 crc 
kubenswrapper[4912]: I0318 13:29:35.208679 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.221790 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.227845 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.228140 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d7sqs" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.229027 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.236921 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1102f18c-97a0-41a7-b08b-f208fa48ec08-logs\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.237210 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klpj2\" (UniqueName: \"kubernetes.io/projected/1102f18c-97a0-41a7-b08b-f208fa48ec08-kube-api-access-klpj2\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.237267 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-config-data\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.237303 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.238999 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1102f18c-97a0-41a7-b08b-f208fa48ec08-logs\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.241704 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.245636 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-config-data\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.273367 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klpj2\" (UniqueName: \"kubernetes.io/projected/1102f18c-97a0-41a7-b08b-f208fa48ec08-kube-api-access-klpj2\") pod \"nova-api-0\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.341846 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-474n5\" (UniqueName: \"kubernetes.io/projected/00c7cbde-e4fa-42da-98c7-6e3b406326e3-kube-api-access-474n5\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 
13:29:35.342105 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.342282 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-config-data\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.342375 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-scripts\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.404004 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.445354 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-474n5\" (UniqueName: \"kubernetes.io/projected/00c7cbde-e4fa-42da-98c7-6e3b406326e3-kube-api-access-474n5\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.445498 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.445576 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-config-data\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.445620 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-scripts\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.457680 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.459231 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-config-data\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.459548 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-scripts\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.466692 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-474n5\" (UniqueName: \"kubernetes.io/projected/00c7cbde-e4fa-42da-98c7-6e3b406326e3-kube-api-access-474n5\") pod \"aodh-0\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") " pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.734816 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.990967 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed","Type":"ContainerStarted","Data":"186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e"} Mar 18 13:29:35 crc kubenswrapper[4912]: I0318 13:29:35.991545 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed","Type":"ContainerStarted","Data":"202daf2a7c70b968bb622456c26f6a47abb1f665f15a4364bd3b3d67e87fa492"} Mar 18 13:29:36 crc kubenswrapper[4912]: I0318 13:29:36.016987 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.016956939 podStartE2EDuration="3.016956939s" podCreationTimestamp="2026-03-18 13:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 13:29:36.014049491 +0000 UTC m=+1624.473476906" watchObservedRunningTime="2026-03-18 13:29:36.016956939 +0000 UTC m=+1624.476384364" Mar 18 13:29:36 crc kubenswrapper[4912]: I0318 13:29:36.073539 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:36 crc kubenswrapper[4912]: I0318 13:29:36.250149 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d547696f-3142-42a2-8f44-3aee1b48849c" path="/var/lib/kubelet/pods/d547696f-3142-42a2-8f44-3aee1b48849c/volumes" Mar 18 13:29:36 crc kubenswrapper[4912]: I0318 13:29:36.464551 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 13:29:37 crc kubenswrapper[4912]: I0318 13:29:37.010186 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerStarted","Data":"4fb53b75e467158158daa6a9715652dd091bcbcc78064a1b928bdb56e79ff4ca"} Mar 18 13:29:37 crc kubenswrapper[4912]: I0318 13:29:37.022187 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1102f18c-97a0-41a7-b08b-f208fa48ec08","Type":"ContainerStarted","Data":"b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9"} Mar 18 13:29:37 crc kubenswrapper[4912]: I0318 13:29:37.022264 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1102f18c-97a0-41a7-b08b-f208fa48ec08","Type":"ContainerStarted","Data":"dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984"} Mar 18 13:29:37 crc kubenswrapper[4912]: I0318 13:29:37.022287 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1102f18c-97a0-41a7-b08b-f208fa48ec08","Type":"ContainerStarted","Data":"5e220a817eaf5f1163e3f4651f87112596755d2ca638232e44624fad3d4a84ec"} Mar 18 13:29:37 crc kubenswrapper[4912]: I0318 13:29:37.055062 4912 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.055029986 podStartE2EDuration="2.055029986s" podCreationTimestamp="2026-03-18 13:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:37.053625628 +0000 UTC m=+1625.513053053" watchObservedRunningTime="2026-03-18 13:29:37.055029986 +0000 UTC m=+1625.514457411" Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.065942 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerStarted","Data":"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"} Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.093682 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.094349 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="proxy-httpd" containerID="cri-o://57e0739129e4a6ae5899001611fc239450a00969b77839224c062e192482ad01" gracePeriod=30 Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.094535 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="sg-core" containerID="cri-o://0f42fbac48d34c4d44ae825497153e86acb258eca1cbf6882960d308f795eba2" gracePeriod=30 Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.094683 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-notification-agent" containerID="cri-o://e891f274822d69f2a65aaf57003f10c28a7a4e8758e170a32ed2956d883fc43f" gracePeriod=30 Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.094116 4912 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-central-agent" containerID="cri-o://9254227e5b22f2b583f7fa29aeb0a107f3fdfde7a15bd360d4dade3513bdaf3d" gracePeriod=30 Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.204364 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.253:3000/\": read tcp 10.217.0.2:34818->10.217.0.253:3000: read: connection reset by peer" Mar 18 13:29:38 crc kubenswrapper[4912]: I0318 13:29:38.550547 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 13:29:39 crc kubenswrapper[4912]: I0318 13:29:39.082543 4912 generic.go:334] "Generic (PLEG): container finished" podID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerID="57e0739129e4a6ae5899001611fc239450a00969b77839224c062e192482ad01" exitCode=0 Mar 18 13:29:39 crc kubenswrapper[4912]: I0318 13:29:39.082590 4912 generic.go:334] "Generic (PLEG): container finished" podID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerID="0f42fbac48d34c4d44ae825497153e86acb258eca1cbf6882960d308f795eba2" exitCode=2 Mar 18 13:29:39 crc kubenswrapper[4912]: I0318 13:29:39.082601 4912 generic.go:334] "Generic (PLEG): container finished" podID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerID="9254227e5b22f2b583f7fa29aeb0a107f3fdfde7a15bd360d4dade3513bdaf3d" exitCode=0 Mar 18 13:29:39 crc kubenswrapper[4912]: I0318 13:29:39.082644 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerDied","Data":"57e0739129e4a6ae5899001611fc239450a00969b77839224c062e192482ad01"} Mar 18 13:29:39 crc kubenswrapper[4912]: I0318 13:29:39.082722 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerDied","Data":"0f42fbac48d34c4d44ae825497153e86acb258eca1cbf6882960d308f795eba2"} Mar 18 13:29:39 crc kubenswrapper[4912]: I0318 13:29:39.082744 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerDied","Data":"9254227e5b22f2b583f7fa29aeb0a107f3fdfde7a15bd360d4dade3513bdaf3d"} Mar 18 13:29:39 crc kubenswrapper[4912]: I0318 13:29:39.462690 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 13:29:40 crc kubenswrapper[4912]: I0318 13:29:40.512563 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.119910 4912 generic.go:334] "Generic (PLEG): container finished" podID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerID="e891f274822d69f2a65aaf57003f10c28a7a4e8758e170a32ed2956d883fc43f" exitCode=0 Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.119969 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerDied","Data":"e891f274822d69f2a65aaf57003f10c28a7a4e8758e170a32ed2956d883fc43f"} Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.217149 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.372135 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-log-httpd\") pod \"481be8f3-c3b2-4c54-9488-d1f710e706f3\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.372624 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "481be8f3-c3b2-4c54-9488-d1f710e706f3" (UID: "481be8f3-c3b2-4c54-9488-d1f710e706f3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.372697 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpg55\" (UniqueName: \"kubernetes.io/projected/481be8f3-c3b2-4c54-9488-d1f710e706f3-kube-api-access-fpg55\") pod \"481be8f3-c3b2-4c54-9488-d1f710e706f3\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.372800 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-combined-ca-bundle\") pod \"481be8f3-c3b2-4c54-9488-d1f710e706f3\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.372819 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-sg-core-conf-yaml\") pod \"481be8f3-c3b2-4c54-9488-d1f710e706f3\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.372864 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-scripts\") pod \"481be8f3-c3b2-4c54-9488-d1f710e706f3\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.372938 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-run-httpd\") pod \"481be8f3-c3b2-4c54-9488-d1f710e706f3\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.373013 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-config-data\") pod \"481be8f3-c3b2-4c54-9488-d1f710e706f3\" (UID: \"481be8f3-c3b2-4c54-9488-d1f710e706f3\") " Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.373154 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "481be8f3-c3b2-4c54-9488-d1f710e706f3" (UID: "481be8f3-c3b2-4c54-9488-d1f710e706f3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.373905 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.373928 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481be8f3-c3b2-4c54-9488-d1f710e706f3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.379280 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-scripts" (OuterVolumeSpecName: "scripts") pod "481be8f3-c3b2-4c54-9488-d1f710e706f3" (UID: "481be8f3-c3b2-4c54-9488-d1f710e706f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.381934 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481be8f3-c3b2-4c54-9488-d1f710e706f3-kube-api-access-fpg55" (OuterVolumeSpecName: "kube-api-access-fpg55") pod "481be8f3-c3b2-4c54-9488-d1f710e706f3" (UID: "481be8f3-c3b2-4c54-9488-d1f710e706f3"). InnerVolumeSpecName "kube-api-access-fpg55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.416245 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "481be8f3-c3b2-4c54-9488-d1f710e706f3" (UID: "481be8f3-c3b2-4c54-9488-d1f710e706f3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.477936 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.478270 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpg55\" (UniqueName: \"kubernetes.io/projected/481be8f3-c3b2-4c54-9488-d1f710e706f3-kube-api-access-fpg55\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.478657 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.485357 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481be8f3-c3b2-4c54-9488-d1f710e706f3" (UID: "481be8f3-c3b2-4c54-9488-d1f710e706f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.517688 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-config-data" (OuterVolumeSpecName: "config-data") pod "481be8f3-c3b2-4c54-9488-d1f710e706f3" (UID: "481be8f3-c3b2-4c54-9488-d1f710e706f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.582116 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:41 crc kubenswrapper[4912]: I0318 13:29:41.582567 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481be8f3-c3b2-4c54-9488-d1f710e706f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.140833 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerStarted","Data":"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"} Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.146350 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481be8f3-c3b2-4c54-9488-d1f710e706f3","Type":"ContainerDied","Data":"cc2d81ff98d1b26ed673b7e44770b4836cd9c1bc0c3170330f9b1b59dc4b5fda"} Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.146421 4912 scope.go:117] "RemoveContainer" containerID="57e0739129e4a6ae5899001611fc239450a00969b77839224c062e192482ad01" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.147187 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.192825 4912 scope.go:117] "RemoveContainer" containerID="0f42fbac48d34c4d44ae825497153e86acb258eca1cbf6882960d308f795eba2" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.205327 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.232369 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.260912 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" path="/var/lib/kubelet/pods/481be8f3-c3b2-4c54-9488-d1f710e706f3/volumes" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.262143 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:42 crc kubenswrapper[4912]: E0318 13:29:42.262727 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="sg-core" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.262753 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="sg-core" Mar 18 13:29:42 crc kubenswrapper[4912]: E0318 13:29:42.262793 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-notification-agent" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.262804 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-notification-agent" Mar 18 13:29:42 crc kubenswrapper[4912]: E0318 13:29:42.262861 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-central-agent" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 
13:29:42.262873 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-central-agent" Mar 18 13:29:42 crc kubenswrapper[4912]: E0318 13:29:42.262894 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="proxy-httpd" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.262904 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="proxy-httpd" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.263608 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="sg-core" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.263646 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-notification-agent" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.263677 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="ceilometer-central-agent" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.263689 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="481be8f3-c3b2-4c54-9488-d1f710e706f3" containerName="proxy-httpd" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.267245 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.268770 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.270726 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.275291 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.291005 4912 scope.go:117] "RemoveContainer" containerID="e891f274822d69f2a65aaf57003f10c28a7a4e8758e170a32ed2956d883fc43f" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.295708 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.295793 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.310952 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.311151 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-scripts\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.311212 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-log-httpd\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.311238 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-config-data\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.311377 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-run-httpd\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.311482 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.311659 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pqcj\" (UniqueName: \"kubernetes.io/projected/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-kube-api-access-8pqcj\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.333232 4912 scope.go:117] "RemoveContainer" containerID="9254227e5b22f2b583f7fa29aeb0a107f3fdfde7a15bd360d4dade3513bdaf3d" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.414552 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-log-httpd\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.414613 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-config-data\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.414729 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-run-httpd\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.414799 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.415018 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pqcj\" (UniqueName: \"kubernetes.io/projected/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-kube-api-access-8pqcj\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.415290 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" 
Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.415442 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-scripts\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.416522 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-log-httpd\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.417840 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-run-httpd\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.424418 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-config-data\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.424486 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-scripts\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.426754 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.428639 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.450871 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pqcj\" (UniqueName: \"kubernetes.io/projected/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-kube-api-access-8pqcj\") pod \"ceilometer-0\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " pod="openstack/ceilometer-0" Mar 18 13:29:42 crc kubenswrapper[4912]: I0318 13:29:42.600907 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:29:43 crc kubenswrapper[4912]: I0318 13:29:43.362953 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.1:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:29:43 crc kubenswrapper[4912]: I0318 13:29:43.362622 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.1:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:29:43 crc kubenswrapper[4912]: I0318 13:29:43.525629 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:44 crc kubenswrapper[4912]: I0318 13:29:44.204322 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerStarted","Data":"a312297b694974799b66cff20063ec5a0a6986512914910d89943c50538aa144"} Mar 18 13:29:44 crc kubenswrapper[4912]: I0318 13:29:44.207640 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerStarted","Data":"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"} Mar 18 13:29:44 crc kubenswrapper[4912]: I0318 13:29:44.463111 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 13:29:44 crc kubenswrapper[4912]: I0318 13:29:44.513390 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.223680 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerStarted","Data":"c0eb26bb7742489435ac61860d44792fc45eaa49bdf11efe8a9a4b932572bb14"} Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.232235 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerStarted","Data":"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"} Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.232388 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-api" containerID="cri-o://485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513" gracePeriod=30 Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.232438 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-listener" 
containerID="cri-o://2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1" gracePeriod=30 Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.232525 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-evaluator" containerID="cri-o://b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8" gracePeriod=30 Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.232509 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-notifier" containerID="cri-o://370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706" gracePeriod=30 Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.291671 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.164773197 podStartE2EDuration="10.291641646s" podCreationTimestamp="2026-03-18 13:29:35 +0000 UTC" firstStartedPulling="2026-03-18 13:29:36.483219254 +0000 UTC m=+1624.942646679" lastFinishedPulling="2026-03-18 13:29:44.610087703 +0000 UTC m=+1633.069515128" observedRunningTime="2026-03-18 13:29:45.259260436 +0000 UTC m=+1633.718687871" watchObservedRunningTime="2026-03-18 13:29:45.291641646 +0000 UTC m=+1633.751069071" Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.326343 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.406385 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:29:45 crc kubenswrapper[4912]: I0318 13:29:45.406477 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:29:46 crc kubenswrapper[4912]: I0318 13:29:46.252576 4912 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerStarted","Data":"1780a49a9bd91b3ea6c7bcd141d2c584abf09e942380efb6d2f3e82337a775e6"} Mar 18 13:29:46 crc kubenswrapper[4912]: I0318 13:29:46.257297 4912 generic.go:334] "Generic (PLEG): container finished" podID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerID="b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8" exitCode=0 Mar 18 13:29:46 crc kubenswrapper[4912]: I0318 13:29:46.257335 4912 generic.go:334] "Generic (PLEG): container finished" podID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerID="485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513" exitCode=0 Mar 18 13:29:46 crc kubenswrapper[4912]: I0318 13:29:46.257373 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerDied","Data":"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"} Mar 18 13:29:46 crc kubenswrapper[4912]: I0318 13:29:46.257438 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerDied","Data":"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"} Mar 18 13:29:46 crc kubenswrapper[4912]: I0318 13:29:46.489284 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:29:46 crc kubenswrapper[4912]: I0318 13:29:46.489297 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Mar 18 13:29:47 crc kubenswrapper[4912]: I0318 13:29:47.275072 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerStarted","Data":"0015aeed172d4b010017acc4c88822d119bb3a30f137ea78353a10029aff1fde"} Mar 18 13:29:49 crc kubenswrapper[4912]: I0318 13:29:49.320474 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerStarted","Data":"0e80cbbf83514aeec10b89b22d803c6276231f6984c93000ac7ace05961c0340"} Mar 18 13:29:49 crc kubenswrapper[4912]: I0318 13:29:49.322843 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:29:49 crc kubenswrapper[4912]: I0318 13:29:49.354543 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.988650418 podStartE2EDuration="7.354516231s" podCreationTimestamp="2026-03-18 13:29:42 +0000 UTC" firstStartedPulling="2026-03-18 13:29:43.542207776 +0000 UTC m=+1632.001635191" lastFinishedPulling="2026-03-18 13:29:48.908073579 +0000 UTC m=+1637.367501004" observedRunningTime="2026-03-18 13:29:49.347911674 +0000 UTC m=+1637.807339099" watchObservedRunningTime="2026-03-18 13:29:49.354516231 +0000 UTC m=+1637.813943656" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.059341 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.169964 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-config-data\") pod \"37e6287a-9a5f-44a2-b9ca-a855075fd554\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.170303 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75zpb\" (UniqueName: \"kubernetes.io/projected/37e6287a-9a5f-44a2-b9ca-a855075fd554-kube-api-access-75zpb\") pod \"37e6287a-9a5f-44a2-b9ca-a855075fd554\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.170353 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-combined-ca-bundle\") pod \"37e6287a-9a5f-44a2-b9ca-a855075fd554\" (UID: \"37e6287a-9a5f-44a2-b9ca-a855075fd554\") " Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.177286 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e6287a-9a5f-44a2-b9ca-a855075fd554-kube-api-access-75zpb" (OuterVolumeSpecName: "kube-api-access-75zpb") pod "37e6287a-9a5f-44a2-b9ca-a855075fd554" (UID: "37e6287a-9a5f-44a2-b9ca-a855075fd554"). InnerVolumeSpecName "kube-api-access-75zpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.206259 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37e6287a-9a5f-44a2-b9ca-a855075fd554" (UID: "37e6287a-9a5f-44a2-b9ca-a855075fd554"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.214883 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-config-data" (OuterVolumeSpecName: "config-data") pod "37e6287a-9a5f-44a2-b9ca-a855075fd554" (UID: "37e6287a-9a5f-44a2-b9ca-a855075fd554"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.273755 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75zpb\" (UniqueName: \"kubernetes.io/projected/37e6287a-9a5f-44a2-b9ca-a855075fd554-kube-api-access-75zpb\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.273819 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.273837 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e6287a-9a5f-44a2-b9ca-a855075fd554-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.296144 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.296225 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.337084 4912 generic.go:334] "Generic (PLEG): container finished" podID="37e6287a-9a5f-44a2-b9ca-a855075fd554" containerID="a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80" exitCode=137 Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.337164 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"37e6287a-9a5f-44a2-b9ca-a855075fd554","Type":"ContainerDied","Data":"a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80"} Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.337188 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.337271 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"37e6287a-9a5f-44a2-b9ca-a855075fd554","Type":"ContainerDied","Data":"e387ef6c10c90a60a8702de8c702c479b11a27cc5a7cb90f7897c519403e8fab"} Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.337325 4912 scope.go:117] "RemoveContainer" containerID="a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.383562 4912 scope.go:117] "RemoveContainer" containerID="a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80" Mar 18 13:29:50 crc kubenswrapper[4912]: E0318 13:29:50.384407 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80\": container with ID starting with a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80 not found: ID does not exist" containerID="a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.384455 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80"} err="failed to get container status \"a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80\": rpc error: code = NotFound desc = could not find container \"a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80\": container with ID starting 
with a64b960df9c38eee3844ab3fbd31d274c9b27cce38387d1267c11639cda43d80 not found: ID does not exist" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.396078 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.418254 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.430775 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:50 crc kubenswrapper[4912]: E0318 13:29:50.431416 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e6287a-9a5f-44a2-b9ca-a855075fd554" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.431436 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e6287a-9a5f-44a2-b9ca-a855075fd554" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.431718 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e6287a-9a5f-44a2-b9ca-a855075fd554" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.432835 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.435190 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.435409 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.436809 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.447108 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.581855 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhj2\" (UniqueName: \"kubernetes.io/projected/7358c044-d1cd-4087-b377-06d6bf36d82b-kube-api-access-7lhj2\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.582291 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.582469 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 
crc kubenswrapper[4912]: I0318 13:29:50.582565 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.582699 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.685093 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhj2\" (UniqueName: \"kubernetes.io/projected/7358c044-d1cd-4087-b377-06d6bf36d82b-kube-api-access-7lhj2\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.685157 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.685225 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc 
kubenswrapper[4912]: I0318 13:29:50.685251 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.685269 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.690583 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.691374 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.699245 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.706366 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7358c044-d1cd-4087-b377-06d6bf36d82b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.716024 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhj2\" (UniqueName: \"kubernetes.io/projected/7358c044-d1cd-4087-b377-06d6bf36d82b-kube-api-access-7lhj2\") pod \"nova-cell1-novncproxy-0\" (UID: \"7358c044-d1cd-4087-b377-06d6bf36d82b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:50 crc kubenswrapper[4912]: I0318 13:29:50.755363 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:51 crc kubenswrapper[4912]: I0318 13:29:51.279500 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:29:51 crc kubenswrapper[4912]: W0318 13:29:51.285527 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7358c044_d1cd_4087_b377_06d6bf36d82b.slice/crio-b2e536ed83bacbacbe50e1fcdc4514255519915605d632b42595282d036d6466 WatchSource:0}: Error finding container b2e536ed83bacbacbe50e1fcdc4514255519915605d632b42595282d036d6466: Status 404 returned error can't find the container with id b2e536ed83bacbacbe50e1fcdc4514255519915605d632b42595282d036d6466 Mar 18 13:29:51 crc kubenswrapper[4912]: I0318 13:29:51.350939 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7358c044-d1cd-4087-b377-06d6bf36d82b","Type":"ContainerStarted","Data":"b2e536ed83bacbacbe50e1fcdc4514255519915605d632b42595282d036d6466"} Mar 18 13:29:52 crc kubenswrapper[4912]: I0318 13:29:52.254345 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e6287a-9a5f-44a2-b9ca-a855075fd554" 
path="/var/lib/kubelet/pods/37e6287a-9a5f-44a2-b9ca-a855075fd554/volumes" Mar 18 13:29:52 crc kubenswrapper[4912]: I0318 13:29:52.302470 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:29:52 crc kubenswrapper[4912]: I0318 13:29:52.304218 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:29:52 crc kubenswrapper[4912]: I0318 13:29:52.307438 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:29:52 crc kubenswrapper[4912]: I0318 13:29:52.372713 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7358c044-d1cd-4087-b377-06d6bf36d82b","Type":"ContainerStarted","Data":"369312e54cfd0aeac9a5e9eb1168ab798f6c9b7031307c9ac6753956bea8dee1"} Mar 18 13:29:52 crc kubenswrapper[4912]: I0318 13:29:52.387210 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:29:52 crc kubenswrapper[4912]: I0318 13:29:52.399814 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.399792889 podStartE2EDuration="2.399792889s" podCreationTimestamp="2026-03-18 13:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:52.391367293 +0000 UTC m=+1640.850794728" watchObservedRunningTime="2026-03-18 13:29:52.399792889 +0000 UTC m=+1640.859220314" Mar 18 13:29:53 crc kubenswrapper[4912]: I0318 13:29:53.406351 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:29:53 crc kubenswrapper[4912]: I0318 13:29:53.407350 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 
13:29:55.409504 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.410353 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.415656 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.416344 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.642453 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-92qkk"] Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.669386 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.689827 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-92qkk"] Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.755976 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.795101 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9rzw\" (UniqueName: \"kubernetes.io/projected/8bd4ffa4-2efe-40eb-98ab-231ef0283598-kube-api-access-n9rzw\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.796377 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-nb\") pod 
\"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.796700 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.796854 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.797131 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-config\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.797673 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.900256 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-nb\") pod 
\"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.900418 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.901478 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.901475 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.901525 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.901632 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-config\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " 
pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.901722 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.901783 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9rzw\" (UniqueName: \"kubernetes.io/projected/8bd4ffa4-2efe-40eb-98ab-231ef0283598-kube-api-access-n9rzw\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.901483 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.902522 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-config\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.903023 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:55 crc kubenswrapper[4912]: I0318 13:29:55.930815 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9rzw\" (UniqueName: \"kubernetes.io/projected/8bd4ffa4-2efe-40eb-98ab-231ef0283598-kube-api-access-n9rzw\") pod \"dnsmasq-dns-f84f9ccf-92qkk\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") " pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:56 crc kubenswrapper[4912]: I0318 13:29:56.040832 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:56 crc kubenswrapper[4912]: I0318 13:29:56.659836 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-92qkk"] Mar 18 13:29:57 crc kubenswrapper[4912]: I0318 13:29:57.445989 4912 generic.go:334] "Generic (PLEG): container finished" podID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerID="c44e2c6a426590aab32be8162631d5f44bce924c5728f58a465190cb4f5717ce" exitCode=0 Mar 18 13:29:57 crc kubenswrapper[4912]: I0318 13:29:57.446937 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" event={"ID":"8bd4ffa4-2efe-40eb-98ab-231ef0283598","Type":"ContainerDied","Data":"c44e2c6a426590aab32be8162631d5f44bce924c5728f58a465190cb4f5717ce"} Mar 18 13:29:57 crc kubenswrapper[4912]: I0318 13:29:57.447018 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" event={"ID":"8bd4ffa4-2efe-40eb-98ab-231ef0283598","Type":"ContainerStarted","Data":"c5bdba073f1b259a6dcc5d3078bcf0ecbe0ed45885da45ba4ba829507a570448"} Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.476790 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" event={"ID":"8bd4ffa4-2efe-40eb-98ab-231ef0283598","Type":"ContainerStarted","Data":"0d651a2c69bf53833e90e04ae8ddb21cd68b4525cfcf7e3d5e48531edd7ab09b"} Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.477607 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.517791 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" podStartSLOduration=3.517770562 podStartE2EDuration="3.517770562s" podCreationTimestamp="2026-03-18 13:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:29:58.513153008 +0000 UTC m=+1646.972580443" watchObservedRunningTime="2026-03-18 13:29:58.517770562 +0000 UTC m=+1646.977197987" Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.581375 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.581643 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-log" containerID="cri-o://dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984" gracePeriod=30 Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.581788 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-api" containerID="cri-o://b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9" gracePeriod=30 Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.856939 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.857307 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-central-agent" containerID="cri-o://c0eb26bb7742489435ac61860d44792fc45eaa49bdf11efe8a9a4b932572bb14" gracePeriod=30 Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 
13:29:58.857357 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="proxy-httpd" containerID="cri-o://0e80cbbf83514aeec10b89b22d803c6276231f6984c93000ac7ace05961c0340" gracePeriod=30 Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.857367 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-notification-agent" containerID="cri-o://1780a49a9bd91b3ea6c7bcd141d2c584abf09e942380efb6d2f3e82337a775e6" gracePeriod=30 Mar 18 13:29:58 crc kubenswrapper[4912]: I0318 13:29:58.857392 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="sg-core" containerID="cri-o://0015aeed172d4b010017acc4c88822d119bb3a30f137ea78353a10029aff1fde" gracePeriod=30 Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.521221 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerID="0e80cbbf83514aeec10b89b22d803c6276231f6984c93000ac7ace05961c0340" exitCode=0 Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.521716 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerID="0015aeed172d4b010017acc4c88822d119bb3a30f137ea78353a10029aff1fde" exitCode=2 Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.521732 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerID="c0eb26bb7742489435ac61860d44792fc45eaa49bdf11efe8a9a4b932572bb14" exitCode=0 Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.521858 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerDied","Data":"0e80cbbf83514aeec10b89b22d803c6276231f6984c93000ac7ace05961c0340"} Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.521902 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerDied","Data":"0015aeed172d4b010017acc4c88822d119bb3a30f137ea78353a10029aff1fde"} Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.521916 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerDied","Data":"c0eb26bb7742489435ac61860d44792fc45eaa49bdf11efe8a9a4b932572bb14"} Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.532652 4912 generic.go:334] "Generic (PLEG): container finished" podID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerID="dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984" exitCode=143 Mar 18 13:29:59 crc kubenswrapper[4912]: I0318 13:29:59.533418 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1102f18c-97a0-41a7-b08b-f208fa48ec08","Type":"ContainerDied","Data":"dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984"} Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.148503 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564010-wh5fm"] Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.150854 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-wh5fm" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.153273 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.153286 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.155380 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.170166 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf"] Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.172851 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.178131 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.178174 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.185760 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-wh5fm"] Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.205251 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf"] Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.246773 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/07d9949b-baff-4ef3-8879-da61b30d7b24-secret-volume\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.246893 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d9949b-baff-4ef3-8879-da61b30d7b24-config-volume\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.247890 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8hgn\" (UniqueName: \"kubernetes.io/projected/4dbccc26-4a01-47c6-a224-7b8355108dfa-kube-api-access-h8hgn\") pod \"auto-csr-approver-29564010-wh5fm\" (UID: \"4dbccc26-4a01-47c6-a224-7b8355108dfa\") " pod="openshift-infra/auto-csr-approver-29564010-wh5fm" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.247985 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w2dw\" (UniqueName: \"kubernetes.io/projected/07d9949b-baff-4ef3-8879-da61b30d7b24-kube-api-access-7w2dw\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.350655 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d9949b-baff-4ef3-8879-da61b30d7b24-config-volume\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 
13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.351971 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8hgn\" (UniqueName: \"kubernetes.io/projected/4dbccc26-4a01-47c6-a224-7b8355108dfa-kube-api-access-h8hgn\") pod \"auto-csr-approver-29564010-wh5fm\" (UID: \"4dbccc26-4a01-47c6-a224-7b8355108dfa\") " pod="openshift-infra/auto-csr-approver-29564010-wh5fm" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.352028 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w2dw\" (UniqueName: \"kubernetes.io/projected/07d9949b-baff-4ef3-8879-da61b30d7b24-kube-api-access-7w2dw\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.352203 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d9949b-baff-4ef3-8879-da61b30d7b24-secret-volume\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.353115 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d9949b-baff-4ef3-8879-da61b30d7b24-config-volume\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.367414 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d9949b-baff-4ef3-8879-da61b30d7b24-secret-volume\") pod \"collect-profiles-29564010-nkjcf\" (UID: 
\"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.374092 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w2dw\" (UniqueName: \"kubernetes.io/projected/07d9949b-baff-4ef3-8879-da61b30d7b24-kube-api-access-7w2dw\") pod \"collect-profiles-29564010-nkjcf\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.374423 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8hgn\" (UniqueName: \"kubernetes.io/projected/4dbccc26-4a01-47c6-a224-7b8355108dfa-kube-api-access-h8hgn\") pod \"auto-csr-approver-29564010-wh5fm\" (UID: \"4dbccc26-4a01-47c6-a224-7b8355108dfa\") " pod="openshift-infra/auto-csr-approver-29564010-wh5fm" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.480348 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-wh5fm" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.498791 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.756017 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:30:00 crc kubenswrapper[4912]: I0318 13:30:00.785514 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.092988 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf"] Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.246105 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-wh5fm"] Mar 18 13:30:01 crc kubenswrapper[4912]: W0318 13:30:01.255298 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dbccc26_4a01_47c6_a224_7b8355108dfa.slice/crio-8fdc7e3abc5b2e088674c5947957ab0b0106a2dfe9bdf625027adc0bc81f6b6e WatchSource:0}: Error finding container 8fdc7e3abc5b2e088674c5947957ab0b0106a2dfe9bdf625027adc0bc81f6b6e: Status 404 returned error can't find the container with id 8fdc7e3abc5b2e088674c5947957ab0b0106a2dfe9bdf625027adc0bc81f6b6e Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.570904 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" event={"ID":"07d9949b-baff-4ef3-8879-da61b30d7b24","Type":"ContainerStarted","Data":"c4e2b0d1399d0ac74142743a08b0df0ec8b7bf94c0c981e6e09748d0c7b96f33"} Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.570970 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" 
event={"ID":"07d9949b-baff-4ef3-8879-da61b30d7b24","Type":"ContainerStarted","Data":"4bc4fdbaf7e14f70a4920fcac5ae2b011491c9faed03fd15241f4a699db07795"} Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.572909 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-wh5fm" event={"ID":"4dbccc26-4a01-47c6-a224-7b8355108dfa","Type":"ContainerStarted","Data":"8fdc7e3abc5b2e088674c5947957ab0b0106a2dfe9bdf625027adc0bc81f6b6e"} Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.593510 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" podStartSLOduration=1.5934789390000001 podStartE2EDuration="1.593478939s" podCreationTimestamp="2026-03-18 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:01.591165827 +0000 UTC m=+1650.050593252" watchObservedRunningTime="2026-03-18 13:30:01.593478939 +0000 UTC m=+1650.052906384" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.601246 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.866272 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-v6qzs"] Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.869004 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.872703 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.872955 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.882460 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6qzs"] Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.921265 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.921375 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-scripts\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.921450 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvnf\" (UniqueName: \"kubernetes.io/projected/77126f44-41a8-416e-b198-fd0242a64bb9-kube-api-access-gwvnf\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:01 crc kubenswrapper[4912]: I0318 13:30:01.921496 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-config-data\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.027487 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvnf\" (UniqueName: \"kubernetes.io/projected/77126f44-41a8-416e-b198-fd0242a64bb9-kube-api-access-gwvnf\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.027623 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-config-data\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.027826 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.027979 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-scripts\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.044332 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-scripts\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.045656 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-config-data\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.046067 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.053582 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvnf\" (UniqueName: \"kubernetes.io/projected/77126f44-41a8-416e-b198-fd0242a64bb9-kube-api-access-gwvnf\") pod \"nova-cell1-cell-mapping-v6qzs\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.309429 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.549766 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.625564 4912 generic.go:334] "Generic (PLEG): container finished" podID="07d9949b-baff-4ef3-8879-da61b30d7b24" containerID="c4e2b0d1399d0ac74142743a08b0df0ec8b7bf94c0c981e6e09748d0c7b96f33" exitCode=0 Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.625643 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" event={"ID":"07d9949b-baff-4ef3-8879-da61b30d7b24","Type":"ContainerDied","Data":"c4e2b0d1399d0ac74142743a08b0df0ec8b7bf94c0c981e6e09748d0c7b96f33"} Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.641407 4912 generic.go:334] "Generic (PLEG): container finished" podID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerID="b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9" exitCode=0 Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.641610 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.642856 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1102f18c-97a0-41a7-b08b-f208fa48ec08","Type":"ContainerDied","Data":"b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9"} Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.642908 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1102f18c-97a0-41a7-b08b-f208fa48ec08","Type":"ContainerDied","Data":"5e220a817eaf5f1163e3f4651f87112596755d2ca638232e44624fad3d4a84ec"} Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.642934 4912 scope.go:117] "RemoveContainer" containerID="b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.680597 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1102f18c-97a0-41a7-b08b-f208fa48ec08-logs\") pod \"1102f18c-97a0-41a7-b08b-f208fa48ec08\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.680748 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-combined-ca-bundle\") pod \"1102f18c-97a0-41a7-b08b-f208fa48ec08\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.680790 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klpj2\" (UniqueName: \"kubernetes.io/projected/1102f18c-97a0-41a7-b08b-f208fa48ec08-kube-api-access-klpj2\") pod \"1102f18c-97a0-41a7-b08b-f208fa48ec08\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.681070 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-config-data\") pod \"1102f18c-97a0-41a7-b08b-f208fa48ec08\" (UID: \"1102f18c-97a0-41a7-b08b-f208fa48ec08\") " Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.681373 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1102f18c-97a0-41a7-b08b-f208fa48ec08-logs" (OuterVolumeSpecName: "logs") pod "1102f18c-97a0-41a7-b08b-f208fa48ec08" (UID: "1102f18c-97a0-41a7-b08b-f208fa48ec08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.685353 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1102f18c-97a0-41a7-b08b-f208fa48ec08-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.693491 4912 scope.go:117] "RemoveContainer" containerID="dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.695382 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1102f18c-97a0-41a7-b08b-f208fa48ec08-kube-api-access-klpj2" (OuterVolumeSpecName: "kube-api-access-klpj2") pod "1102f18c-97a0-41a7-b08b-f208fa48ec08" (UID: "1102f18c-97a0-41a7-b08b-f208fa48ec08"). InnerVolumeSpecName "kube-api-access-klpj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.763075 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1102f18c-97a0-41a7-b08b-f208fa48ec08" (UID: "1102f18c-97a0-41a7-b08b-f208fa48ec08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.790429 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.790465 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klpj2\" (UniqueName: \"kubernetes.io/projected/1102f18c-97a0-41a7-b08b-f208fa48ec08-kube-api-access-klpj2\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.806407 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-config-data" (OuterVolumeSpecName: "config-data") pod "1102f18c-97a0-41a7-b08b-f208fa48ec08" (UID: "1102f18c-97a0-41a7-b08b-f208fa48ec08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.894476 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1102f18c-97a0-41a7-b08b-f208fa48ec08-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:02 crc kubenswrapper[4912]: W0318 13:30:02.900391 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77126f44_41a8_416e_b198_fd0242a64bb9.slice/crio-e190eadb0fbda758a7e21736f15c42a35712ce0dbeb58da372a39190e4197775 WatchSource:0}: Error finding container e190eadb0fbda758a7e21736f15c42a35712ce0dbeb58da372a39190e4197775: Status 404 returned error can't find the container with id e190eadb0fbda758a7e21736f15c42a35712ce0dbeb58da372a39190e4197775 Mar 18 13:30:02 crc kubenswrapper[4912]: I0318 13:30:02.904717 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6qzs"] 
Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.074095 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.096179 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.110418 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:03 crc kubenswrapper[4912]: E0318 13:30:03.111309 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-api" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.111336 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-api" Mar 18 13:30:03 crc kubenswrapper[4912]: E0318 13:30:03.111350 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-log" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.111357 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-log" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.111693 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-api" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.111767 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" containerName="nova-api-log" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.113666 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.119153 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.119699 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.119897 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.122073 4912 scope.go:117] "RemoveContainer" containerID="b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9" Mar 18 13:30:03 crc kubenswrapper[4912]: E0318 13:30:03.123478 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9\": container with ID starting with b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9 not found: ID does not exist" containerID="b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.123551 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9"} err="failed to get container status \"b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9\": rpc error: code = NotFound desc = could not find container \"b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9\": container with ID starting with b01c1d494f9a2994e262c2aab3f2e808530208381f608a57d2beb3e02d6796b9 not found: ID does not exist" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.123586 4912 scope.go:117] "RemoveContainer" containerID="dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984" Mar 18 13:30:03 crc 
kubenswrapper[4912]: E0318 13:30:03.124117 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984\": container with ID starting with dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984 not found: ID does not exist" containerID="dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.124142 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984"} err="failed to get container status \"dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984\": rpc error: code = NotFound desc = could not find container \"dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984\": container with ID starting with dc1be307f7df8705387e03349f7d634d1597f3245d903475bba2b30967c69984 not found: ID does not exist" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.128642 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.208430 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.208980 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-config-data\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.209070 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vq4\" (UniqueName: \"kubernetes.io/projected/bd5b8f89-623d-448d-9add-fba05ce4f710-kube-api-access-m5vq4\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.209317 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.209517 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.209742 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5b8f89-623d-448d-9add-fba05ce4f710-logs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.313236 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5b8f89-623d-448d-9add-fba05ce4f710-logs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.327201 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.327434 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-config-data\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.327469 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vq4\" (UniqueName: \"kubernetes.io/projected/bd5b8f89-623d-448d-9add-fba05ce4f710-kube-api-access-m5vq4\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.327584 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.327631 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.314159 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5b8f89-623d-448d-9add-fba05ce4f710-logs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.344803 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.350760 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.351192 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.353083 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-config-data\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.379619 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vq4\" (UniqueName: \"kubernetes.io/projected/bd5b8f89-623d-448d-9add-fba05ce4f710-kube-api-access-m5vq4\") pod \"nova-api-0\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.449805 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.785642 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6qzs" event={"ID":"77126f44-41a8-416e-b198-fd0242a64bb9","Type":"ContainerStarted","Data":"5e2609728611dffb36be40cc38777d196d9cb3d47a8be3589569abd966d3fb8d"} Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.786286 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6qzs" event={"ID":"77126f44-41a8-416e-b198-fd0242a64bb9","Type":"ContainerStarted","Data":"e190eadb0fbda758a7e21736f15c42a35712ce0dbeb58da372a39190e4197775"} Mar 18 13:30:03 crc kubenswrapper[4912]: I0318 13:30:03.896810 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-v6qzs" podStartSLOduration=2.89677632 podStartE2EDuration="2.89677632s" podCreationTimestamp="2026-03-18 13:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:03.840494357 +0000 UTC m=+1652.299921782" watchObservedRunningTime="2026-03-18 13:30:03.89677632 +0000 UTC m=+1652.356203755" Mar 18 13:30:04 crc kubenswrapper[4912]: I0318 13:30:04.278086 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1102f18c-97a0-41a7-b08b-f208fa48ec08" path="/var/lib/kubelet/pods/1102f18c-97a0-41a7-b08b-f208fa48ec08/volumes" Mar 18 13:30:04 crc kubenswrapper[4912]: I0318 13:30:04.872008 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:04 crc kubenswrapper[4912]: I0318 13:30:04.910917 4912 generic.go:334] "Generic (PLEG): container finished" podID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerID="1780a49a9bd91b3ea6c7bcd141d2c584abf09e942380efb6d2f3e82337a775e6" exitCode=0 Mar 18 13:30:04 crc kubenswrapper[4912]: I0318 13:30:04.913077 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerDied","Data":"1780a49a9bd91b3ea6c7bcd141d2c584abf09e942380efb6d2f3e82337a775e6"} Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.214494 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.242679 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.366627 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d9949b-baff-4ef3-8879-da61b30d7b24-secret-volume\") pod \"07d9949b-baff-4ef3-8879-da61b30d7b24\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.366767 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-config-data\") pod \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.366807 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-combined-ca-bundle\") pod \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.366866 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w2dw\" (UniqueName: \"kubernetes.io/projected/07d9949b-baff-4ef3-8879-da61b30d7b24-kube-api-access-7w2dw\") pod \"07d9949b-baff-4ef3-8879-da61b30d7b24\" (UID: 
\"07d9949b-baff-4ef3-8879-da61b30d7b24\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.366942 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pqcj\" (UniqueName: \"kubernetes.io/projected/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-kube-api-access-8pqcj\") pod \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.367127 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d9949b-baff-4ef3-8879-da61b30d7b24-config-volume\") pod \"07d9949b-baff-4ef3-8879-da61b30d7b24\" (UID: \"07d9949b-baff-4ef3-8879-da61b30d7b24\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.367232 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-run-httpd\") pod \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.367290 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-log-httpd\") pod \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.367806 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-scripts\") pod \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.367862 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-sg-core-conf-yaml\") pod \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\" (UID: \"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b\") " Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.369711 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" (UID: "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.370388 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d9949b-baff-4ef3-8879-da61b30d7b24-config-volume" (OuterVolumeSpecName: "config-volume") pod "07d9949b-baff-4ef3-8879-da61b30d7b24" (UID: "07d9949b-baff-4ef3-8879-da61b30d7b24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.372938 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" (UID: "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.387158 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d9949b-baff-4ef3-8879-da61b30d7b24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "07d9949b-baff-4ef3-8879-da61b30d7b24" (UID: "07d9949b-baff-4ef3-8879-da61b30d7b24"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.388306 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d9949b-baff-4ef3-8879-da61b30d7b24-kube-api-access-7w2dw" (OuterVolumeSpecName: "kube-api-access-7w2dw") pod "07d9949b-baff-4ef3-8879-da61b30d7b24" (UID: "07d9949b-baff-4ef3-8879-da61b30d7b24"). InnerVolumeSpecName "kube-api-access-7w2dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.390525 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-kube-api-access-8pqcj" (OuterVolumeSpecName: "kube-api-access-8pqcj") pod "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" (UID: "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b"). InnerVolumeSpecName "kube-api-access-8pqcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.407992 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-scripts" (OuterVolumeSpecName: "scripts") pod "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" (UID: "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.471336 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w2dw\" (UniqueName: \"kubernetes.io/projected/07d9949b-baff-4ef3-8879-da61b30d7b24-kube-api-access-7w2dw\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.471372 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pqcj\" (UniqueName: \"kubernetes.io/projected/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-kube-api-access-8pqcj\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.471385 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d9949b-baff-4ef3-8879-da61b30d7b24-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.471396 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.471406 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.471414 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.471423 4912 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d9949b-baff-4ef3-8879-da61b30d7b24-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.473720 4912 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" (UID: "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.554928 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" (UID: "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.574767 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.574802 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.597761 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-config-data" (OuterVolumeSpecName: "config-data") pod "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" (UID: "f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.678198 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.937342 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" event={"ID":"07d9949b-baff-4ef3-8879-da61b30d7b24","Type":"ContainerDied","Data":"4bc4fdbaf7e14f70a4920fcac5ae2b011491c9faed03fd15241f4a699db07795"} Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.937777 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc4fdbaf7e14f70a4920fcac5ae2b011491c9faed03fd15241f4a699db07795" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.937918 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.943338 4912 generic.go:334] "Generic (PLEG): container finished" podID="4dbccc26-4a01-47c6-a224-7b8355108dfa" containerID="78dc5a609a0c2582918480af69b2d8c185274a6637f5c1f26c4d16edc484f34b" exitCode=0 Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.943359 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-wh5fm" event={"ID":"4dbccc26-4a01-47c6-a224-7b8355108dfa","Type":"ContainerDied","Data":"78dc5a609a0c2582918480af69b2d8c185274a6637f5c1f26c4d16edc484f34b"} Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.948569 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd5b8f89-623d-448d-9add-fba05ce4f710","Type":"ContainerStarted","Data":"bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c"} Mar 18 13:30:05 crc 
kubenswrapper[4912]: I0318 13:30:05.948625 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd5b8f89-623d-448d-9add-fba05ce4f710","Type":"ContainerStarted","Data":"60c92c02e39382ee3e302d66ee6fb75825e9774191cbb2887f79f4d768f4332c"} Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.956601 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b","Type":"ContainerDied","Data":"a312297b694974799b66cff20063ec5a0a6986512914910d89943c50538aa144"} Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.956692 4912 scope.go:117] "RemoveContainer" containerID="0e80cbbf83514aeec10b89b22d803c6276231f6984c93000ac7ace05961c0340" Mar 18 13:30:05 crc kubenswrapper[4912]: I0318 13:30:05.956778 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.011416 4912 scope.go:117] "RemoveContainer" containerID="0015aeed172d4b010017acc4c88822d119bb3a30f137ea78353a10029aff1fde" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.037230 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.044361 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.069164 4912 scope.go:117] "RemoveContainer" containerID="1780a49a9bd91b3ea6c7bcd141d2c584abf09e942380efb6d2f3e82337a775e6" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.092071 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.123282 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:06 crc kubenswrapper[4912]: E0318 13:30:06.124295 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-central-agent" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124339 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-central-agent" Mar 18 13:30:06 crc kubenswrapper[4912]: E0318 13:30:06.124375 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d9949b-baff-4ef3-8879-da61b30d7b24" containerName="collect-profiles" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124383 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d9949b-baff-4ef3-8879-da61b30d7b24" containerName="collect-profiles" Mar 18 13:30:06 crc kubenswrapper[4912]: E0318 13:30:06.124399 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-notification-agent" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124409 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-notification-agent" Mar 18 13:30:06 crc kubenswrapper[4912]: E0318 13:30:06.124443 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="sg-core" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124463 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="sg-core" Mar 18 13:30:06 crc kubenswrapper[4912]: E0318 13:30:06.124498 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="proxy-httpd" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124505 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="proxy-httpd" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124848 4912 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-notification-agent" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124875 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="sg-core" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124888 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="proxy-httpd" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124901 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" containerName="ceilometer-central-agent" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.124922 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d9949b-baff-4ef3-8879-da61b30d7b24" containerName="collect-profiles" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.147882 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.157151 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.157530 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.167541 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.210781 4912 scope.go:117] "RemoveContainer" containerID="c0eb26bb7742489435ac61860d44792fc45eaa49bdf11efe8a9a4b932572bb14" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.223249 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-47t7b"] Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.223577 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerName="dnsmasq-dns" containerID="cri-o://7ece0b5f58e6c98123171222e8d867e616adce174afc06b66271d954c4e6a6ce" gracePeriod=10 Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.303940 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-scripts\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.304017 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-config-data\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc 
kubenswrapper[4912]: I0318 13:30:06.304090 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-run-httpd\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.304120 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-log-httpd\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.304155 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdrqg\" (UniqueName: \"kubernetes.io/projected/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-kube-api-access-sdrqg\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.304180 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.304740 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.407712 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.418959 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-scripts\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.419131 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-config-data\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.419215 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-run-httpd\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.419287 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-log-httpd\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.419352 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdrqg\" (UniqueName: \"kubernetes.io/projected/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-kube-api-access-sdrqg\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc 
kubenswrapper[4912]: I0318 13:30:06.419404 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.427792 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-run-httpd\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.429764 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-log-httpd\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.430776 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b" path="/var/lib/kubelet/pods/f4a9511a-c3e7-4b52-bd9e-67d2aca9a62b/volumes" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.443673 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-config-data\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.449561 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-scripts\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 
13:30:06.453115 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.455638 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdrqg\" (UniqueName: \"kubernetes.io/projected/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-kube-api-access-sdrqg\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.471925 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.531767 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.924391 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.250:5353: connect: connection refused" Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.982555 4912 generic.go:334] "Generic (PLEG): container finished" podID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerID="7ece0b5f58e6c98123171222e8d867e616adce174afc06b66271d954c4e6a6ce" exitCode=0 Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.982653 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" event={"ID":"e93b7e52-0a4c-4372-a747-43fc785c0990","Type":"ContainerDied","Data":"7ece0b5f58e6c98123171222e8d867e616adce174afc06b66271d954c4e6a6ce"} Mar 18 13:30:06 crc kubenswrapper[4912]: I0318 13:30:06.986455 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd5b8f89-623d-448d-9add-fba05ce4f710","Type":"ContainerStarted","Data":"017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b"} Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.023423 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.023399125 podStartE2EDuration="4.023399125s" podCreationTimestamp="2026-03-18 13:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:07.017664411 +0000 UTC m=+1655.477091846" watchObservedRunningTime="2026-03-18 13:30:07.023399125 +0000 UTC m=+1655.482826550" Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.103103 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:07 crc kubenswrapper[4912]: W0318 
13:30:07.146641 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad86d4bc_a71a_4678_ae24_69c2a32ec4c3.slice/crio-e9f9570454e3f7ae8b37e9c4d8cc8b848ed77262136fd2ccd5c5c662077c041a WatchSource:0}: Error finding container e9f9570454e3f7ae8b37e9c4d8cc8b848ed77262136fd2ccd5c5c662077c041a: Status 404 returned error can't find the container with id e9f9570454e3f7ae8b37e9c4d8cc8b848ed77262136fd2ccd5c5c662077c041a Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.708687 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.710715 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-wh5fm" Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.881755 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-sb\") pod \"e93b7e52-0a4c-4372-a747-43fc785c0990\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.882488 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8hgn\" (UniqueName: \"kubernetes.io/projected/4dbccc26-4a01-47c6-a224-7b8355108dfa-kube-api-access-h8hgn\") pod \"4dbccc26-4a01-47c6-a224-7b8355108dfa\" (UID: \"4dbccc26-4a01-47c6-a224-7b8355108dfa\") " Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.882639 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-nb\") pod \"e93b7e52-0a4c-4372-a747-43fc785c0990\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " Mar 18 13:30:07 crc 
kubenswrapper[4912]: I0318 13:30:07.882682 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-config\") pod \"e93b7e52-0a4c-4372-a747-43fc785c0990\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.882722 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-swift-storage-0\") pod \"e93b7e52-0a4c-4372-a747-43fc785c0990\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.882769 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-svc\") pod \"e93b7e52-0a4c-4372-a747-43fc785c0990\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.882858 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb4k4\" (UniqueName: \"kubernetes.io/projected/e93b7e52-0a4c-4372-a747-43fc785c0990-kube-api-access-kb4k4\") pod \"e93b7e52-0a4c-4372-a747-43fc785c0990\" (UID: \"e93b7e52-0a4c-4372-a747-43fc785c0990\") " Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.925016 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93b7e52-0a4c-4372-a747-43fc785c0990-kube-api-access-kb4k4" (OuterVolumeSpecName: "kube-api-access-kb4k4") pod "e93b7e52-0a4c-4372-a747-43fc785c0990" (UID: "e93b7e52-0a4c-4372-a747-43fc785c0990"). InnerVolumeSpecName "kube-api-access-kb4k4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.940150 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbccc26-4a01-47c6-a224-7b8355108dfa-kube-api-access-h8hgn" (OuterVolumeSpecName: "kube-api-access-h8hgn") pod "4dbccc26-4a01-47c6-a224-7b8355108dfa" (UID: "4dbccc26-4a01-47c6-a224-7b8355108dfa"). InnerVolumeSpecName "kube-api-access-h8hgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:07 crc kubenswrapper[4912]: I0318 13:30:07.988373 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb4k4\" (UniqueName: \"kubernetes.io/projected/e93b7e52-0a4c-4372-a747-43fc785c0990-kube-api-access-kb4k4\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.002154 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8hgn\" (UniqueName: \"kubernetes.io/projected/4dbccc26-4a01-47c6-a224-7b8355108dfa-kube-api-access-h8hgn\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.102096 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-config" (OuterVolumeSpecName: "config") pod "e93b7e52-0a4c-4372-a747-43fc785c0990" (UID: "e93b7e52-0a4c-4372-a747-43fc785c0990"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.110265 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.123424 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-wh5fm" event={"ID":"4dbccc26-4a01-47c6-a224-7b8355108dfa","Type":"ContainerDied","Data":"8fdc7e3abc5b2e088674c5947957ab0b0106a2dfe9bdf625027adc0bc81f6b6e"} Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.123505 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fdc7e3abc5b2e088674c5947957ab0b0106a2dfe9bdf625027adc0bc81f6b6e" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.123629 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-wh5fm" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.130813 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerStarted","Data":"e9f9570454e3f7ae8b37e9c4d8cc8b848ed77262136fd2ccd5c5c662077c041a"} Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.138299 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.138538 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-47t7b" event={"ID":"e93b7e52-0a4c-4372-a747-43fc785c0990","Type":"ContainerDied","Data":"76d8209de6757f1e848e1870313ee04b5895ff697bb8c59ae8bc36362a7ad02a"} Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.138625 4912 scope.go:117] "RemoveContainer" containerID="7ece0b5f58e6c98123171222e8d867e616adce174afc06b66271d954c4e6a6ce" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.139511 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e93b7e52-0a4c-4372-a747-43fc785c0990" (UID: "e93b7e52-0a4c-4372-a747-43fc785c0990"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.182144 4912 scope.go:117] "RemoveContainer" containerID="adf9f7a50dfe50e2deaf98dc66900e042b0e6803b1f47ee8dd5c7421a17600f9" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.192112 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e93b7e52-0a4c-4372-a747-43fc785c0990" (UID: "e93b7e52-0a4c-4372-a747-43fc785c0990"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.196293 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e93b7e52-0a4c-4372-a747-43fc785c0990" (UID: "e93b7e52-0a4c-4372-a747-43fc785c0990"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.212675 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.212721 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.212733 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.214450 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e93b7e52-0a4c-4372-a747-43fc785c0990" (UID: "e93b7e52-0a4c-4372-a747-43fc785c0990"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.318223 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e93b7e52-0a4c-4372-a747-43fc785c0990-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.474906 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-47t7b"] Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.487638 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-47t7b"] Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.868117 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-9dzjl"] Mar 18 13:30:08 crc kubenswrapper[4912]: I0318 13:30:08.883668 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-9dzjl"] Mar 18 13:30:09 crc kubenswrapper[4912]: I0318 13:30:09.156058 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerStarted","Data":"e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046"} Mar 18 13:30:09 crc kubenswrapper[4912]: I0318 13:30:09.156114 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerStarted","Data":"442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8"} Mar 18 13:30:10 crc kubenswrapper[4912]: I0318 13:30:10.177697 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerStarted","Data":"5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037"} Mar 18 13:30:10 crc kubenswrapper[4912]: I0318 13:30:10.247648 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="9970cb0b-ff3a-4850-a612-03c861a5bbf4" path="/var/lib/kubelet/pods/9970cb0b-ff3a-4850-a612-03c861a5bbf4/volumes" Mar 18 13:30:10 crc kubenswrapper[4912]: I0318 13:30:10.248626 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" path="/var/lib/kubelet/pods/e93b7e52-0a4c-4372-a747-43fc785c0990/volumes" Mar 18 13:30:11 crc kubenswrapper[4912]: I0318 13:30:11.193542 4912 generic.go:334] "Generic (PLEG): container finished" podID="77126f44-41a8-416e-b198-fd0242a64bb9" containerID="5e2609728611dffb36be40cc38777d196d9cb3d47a8be3589569abd966d3fb8d" exitCode=0 Mar 18 13:30:11 crc kubenswrapper[4912]: I0318 13:30:11.194016 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6qzs" event={"ID":"77126f44-41a8-416e-b198-fd0242a64bb9","Type":"ContainerDied","Data":"5e2609728611dffb36be40cc38777d196d9cb3d47a8be3589569abd966d3fb8d"} Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.215218 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerStarted","Data":"463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646"} Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.215752 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.247518 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.609060489 podStartE2EDuration="6.247496548s" podCreationTimestamp="2026-03-18 13:30:06 +0000 UTC" firstStartedPulling="2026-03-18 13:30:07.153330658 +0000 UTC m=+1655.612758103" lastFinishedPulling="2026-03-18 13:30:11.791766737 +0000 UTC m=+1660.251194162" observedRunningTime="2026-03-18 13:30:12.247067777 +0000 UTC m=+1660.706495212" watchObservedRunningTime="2026-03-18 13:30:12.247496548 +0000 
UTC m=+1660.706923973" Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.822911 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.997483 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-scripts\") pod \"77126f44-41a8-416e-b198-fd0242a64bb9\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.997821 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvnf\" (UniqueName: \"kubernetes.io/projected/77126f44-41a8-416e-b198-fd0242a64bb9-kube-api-access-gwvnf\") pod \"77126f44-41a8-416e-b198-fd0242a64bb9\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.997955 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-combined-ca-bundle\") pod \"77126f44-41a8-416e-b198-fd0242a64bb9\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " Mar 18 13:30:12 crc kubenswrapper[4912]: I0318 13:30:12.998002 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-config-data\") pod \"77126f44-41a8-416e-b198-fd0242a64bb9\" (UID: \"77126f44-41a8-416e-b198-fd0242a64bb9\") " Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.023471 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77126f44-41a8-416e-b198-fd0242a64bb9-kube-api-access-gwvnf" (OuterVolumeSpecName: "kube-api-access-gwvnf") pod "77126f44-41a8-416e-b198-fd0242a64bb9" (UID: "77126f44-41a8-416e-b198-fd0242a64bb9"). 
InnerVolumeSpecName "kube-api-access-gwvnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.030007 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-scripts" (OuterVolumeSpecName: "scripts") pod "77126f44-41a8-416e-b198-fd0242a64bb9" (UID: "77126f44-41a8-416e-b198-fd0242a64bb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.061305 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77126f44-41a8-416e-b198-fd0242a64bb9" (UID: "77126f44-41a8-416e-b198-fd0242a64bb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.061436 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-config-data" (OuterVolumeSpecName: "config-data") pod "77126f44-41a8-416e-b198-fd0242a64bb9" (UID: "77126f44-41a8-416e-b198-fd0242a64bb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.102118 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwvnf\" (UniqueName: \"kubernetes.io/projected/77126f44-41a8-416e-b198-fd0242a64bb9-kube-api-access-gwvnf\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.102775 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.102870 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.102935 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77126f44-41a8-416e-b198-fd0242a64bb9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.229213 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6qzs" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.230476 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6qzs" event={"ID":"77126f44-41a8-416e-b198-fd0242a64bb9","Type":"ContainerDied","Data":"e190eadb0fbda758a7e21736f15c42a35712ce0dbeb58da372a39190e4197775"} Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.230573 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e190eadb0fbda758a7e21736f15c42a35712ce0dbeb58da372a39190e4197775" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.453498 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.453565 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.472752 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.473103 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" containerName="nova-scheduler-scheduler" containerID="cri-o://186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e" gracePeriod=30 Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.490731 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.509054 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.509345 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-log" 
containerID="cri-o://fdf584e6131f430e893a76b345416cc3fc060f6768ab2ecb498d91aed7ce56be" gracePeriod=30 Mar 18 13:30:13 crc kubenswrapper[4912]: I0318 13:30:13.509528 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-metadata" containerID="cri-o://f2a076bd6f7cb3166561c238610017d2f2d91d31f5f7623809bf4d0b1410acb7" gracePeriod=30 Mar 18 13:30:14 crc kubenswrapper[4912]: I0318 13:30:14.274441 4912 generic.go:334] "Generic (PLEG): container finished" podID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerID="fdf584e6131f430e893a76b345416cc3fc060f6768ab2ecb498d91aed7ce56be" exitCode=143 Mar 18 13:30:14 crc kubenswrapper[4912]: I0318 13:30:14.275231 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-log" containerID="cri-o://bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c" gracePeriod=30 Mar 18 13:30:14 crc kubenswrapper[4912]: I0318 13:30:14.275377 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19ad23f9-f6b9-4ead-8326-29c9d0537d63","Type":"ContainerDied","Data":"fdf584e6131f430e893a76b345416cc3fc060f6768ab2ecb498d91aed7ce56be"} Mar 18 13:30:14 crc kubenswrapper[4912]: I0318 13:30:14.275892 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-api" containerID="cri-o://017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b" gracePeriod=30 Mar 18 13:30:14 crc kubenswrapper[4912]: I0318 13:30:14.286861 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.11:8774/\": EOF" Mar 18 
13:30:14 crc kubenswrapper[4912]: I0318 13:30:14.287109 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.11:8774/\": EOF" Mar 18 13:30:14 crc kubenswrapper[4912]: E0318 13:30:14.467672 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:30:14 crc kubenswrapper[4912]: E0318 13:30:14.476406 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:30:14 crc kubenswrapper[4912]: E0318 13:30:14.483703 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:30:14 crc kubenswrapper[4912]: E0318 13:30:14.483819 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" containerName="nova-scheduler-scheduler" Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.333900 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerID="bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c" exitCode=143 Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.334095 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd5b8f89-623d-448d-9add-fba05ce4f710","Type":"ContainerDied","Data":"bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c"} Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.338623 4912 generic.go:334] "Generic (PLEG): container finished" podID="91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" containerID="186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e" exitCode=0 Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.338693 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed","Type":"ContainerDied","Data":"186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e"} Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.652086 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.707740 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-combined-ca-bundle\") pod \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.707853 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-config-data\") pod \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.707879 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvdcd\" (UniqueName: \"kubernetes.io/projected/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-kube-api-access-gvdcd\") pod \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\" (UID: \"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed\") " Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.744301 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-kube-api-access-gvdcd" (OuterVolumeSpecName: "kube-api-access-gvdcd") pod "91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" (UID: "91ce66a9-93da-4cc6-9ed8-2b9f96b330ed"). InnerVolumeSpecName "kube-api-access-gvdcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.789149 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-config-data" (OuterVolumeSpecName: "config-data") pod "91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" (UID: "91ce66a9-93da-4cc6-9ed8-2b9f96b330ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.812242 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.812585 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvdcd\" (UniqueName: \"kubernetes.io/projected/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-kube-api-access-gvdcd\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.821134 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" (UID: "91ce66a9-93da-4cc6-9ed8-2b9f96b330ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:15 crc kubenswrapper[4912]: I0318 13:30:15.915134 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.224833 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.233851 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-combined-ca-bundle\") pod \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") "
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.233932 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-config-data\") pod \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") "
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.234121 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-scripts\") pod \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") "
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.234176 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-474n5\" (UniqueName: \"kubernetes.io/projected/00c7cbde-e4fa-42da-98c7-6e3b406326e3-kube-api-access-474n5\") pod \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\" (UID: \"00c7cbde-e4fa-42da-98c7-6e3b406326e3\") "
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.239538 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-scripts" (OuterVolumeSpecName: "scripts") pod "00c7cbde-e4fa-42da-98c7-6e3b406326e3" (UID: "00c7cbde-e4fa-42da-98c7-6e3b406326e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.248464 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c7cbde-e4fa-42da-98c7-6e3b406326e3-kube-api-access-474n5" (OuterVolumeSpecName: "kube-api-access-474n5") pod "00c7cbde-e4fa-42da-98c7-6e3b406326e3" (UID: "00c7cbde-e4fa-42da-98c7-6e3b406326e3"). InnerVolumeSpecName "kube-api-access-474n5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.338963 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.339006 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-474n5\" (UniqueName: \"kubernetes.io/projected/00c7cbde-e4fa-42da-98c7-6e3b406326e3-kube-api-access-474n5\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.373008 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"91ce66a9-93da-4cc6-9ed8-2b9f96b330ed","Type":"ContainerDied","Data":"202daf2a7c70b968bb622456c26f6a47abb1f665f15a4364bd3b3d67e87fa492"}
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.373169 4912 scope.go:117] "RemoveContainer" containerID="186b42d3244e9c85726fe9a2209b1212786962ef4fc70f29c8edb3b86ba4dd3e"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.377597 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.399498 4912 generic.go:334] "Generic (PLEG): container finished" podID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerID="2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1" exitCode=137
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.399558 4912 generic.go:334] "Generic (PLEG): container finished" podID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerID="370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706" exitCode=137
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.399595 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerDied","Data":"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"}
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.399637 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerDied","Data":"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"}
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.399635 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.399648 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"00c7cbde-e4fa-42da-98c7-6e3b406326e3","Type":"ContainerDied","Data":"4fb53b75e467158158daa6a9715652dd091bcbcc78064a1b928bdb56e79ff4ca"}
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.491499 4912 scope.go:117] "RemoveContainer" containerID="2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.503130 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.529431 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.551479 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552722 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerName="init"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552749 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerName="init"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552771 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-api"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552778 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-api"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552801 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77126f44-41a8-416e-b198-fd0242a64bb9" containerName="nova-manage"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552809 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="77126f44-41a8-416e-b198-fd0242a64bb9" containerName="nova-manage"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552827 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-notifier"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552836 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-notifier"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552852 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" containerName="nova-scheduler-scheduler"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552858 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" containerName="nova-scheduler-scheduler"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552875 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerName="dnsmasq-dns"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552883 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerName="dnsmasq-dns"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552898 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-listener"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552907 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-listener"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.552918 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-evaluator"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.552926 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-evaluator"
Mar 18 13:30:16 crc kubenswrapper[4912]: E0318 13:30:16.553220 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbccc26-4a01-47c6-a224-7b8355108dfa" containerName="oc"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553229 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbccc26-4a01-47c6-a224-7b8355108dfa" containerName="oc"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553464 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" containerName="nova-scheduler-scheduler"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553488 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93b7e52-0a4c-4372-a747-43fc785c0990" containerName="dnsmasq-dns"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553506 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-notifier"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553520 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-listener"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553535 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="77126f44-41a8-416e-b198-fd0242a64bb9" containerName="nova-manage"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553552 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbccc26-4a01-47c6-a224-7b8355108dfa" containerName="oc"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553565 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-api"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.553576 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" containerName="aodh-evaluator"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.555969 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.561362 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.566240 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-config-data" (OuterVolumeSpecName: "config-data") pod "00c7cbde-e4fa-42da-98c7-6e3b406326e3" (UID: "00c7cbde-e4fa-42da-98c7-6e3b406326e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.612534 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.643359 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00c7cbde-e4fa-42da-98c7-6e3b406326e3" (UID: "00c7cbde-e4fa-42da-98c7-6e3b406326e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.718506 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f957cfd-c546-4a96-a235-ff2d1475ff7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.718677 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f957cfd-c546-4a96-a235-ff2d1475ff7a-config-data\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.730694 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h568j\" (UniqueName: \"kubernetes.io/projected/0f957cfd-c546-4a96-a235-ff2d1475ff7a-kube-api-access-h568j\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.731329 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c7cbde-e4fa-42da-98c7-6e3b406326e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.737456 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.778904 4912 scope.go:117] "RemoveContainer" containerID="370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.846196 4912 scope.go:117] "RemoveContainer" containerID="b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.850359 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f957cfd-c546-4a96-a235-ff2d1475ff7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.850675 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f957cfd-c546-4a96-a235-ff2d1475ff7a-config-data\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.850818 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h568j\" (UniqueName: \"kubernetes.io/projected/0f957cfd-c546-4a96-a235-ff2d1475ff7a-kube-api-access-h568j\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.865874 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f957cfd-c546-4a96-a235-ff2d1475ff7a-config-data\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.871963 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f957cfd-c546-4a96-a235-ff2d1475ff7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.881476 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.923144 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h568j\" (UniqueName: \"kubernetes.io/projected/0f957cfd-c546-4a96-a235-ff2d1475ff7a-kube-api-access-h568j\") pod \"nova-scheduler-0\" (UID: \"0f957cfd-c546-4a96-a235-ff2d1475ff7a\") " pod="openstack/nova-scheduler-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.948122 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.980877 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.984491 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.989553 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.989653 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.989828 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.992890 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.993137 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d7sqs"
Mar 18 13:30:16 crc kubenswrapper[4912]: I0318 13:30:16.997940 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.038319 4912 scope.go:117] "RemoveContainer" containerID="485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.064193 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-public-tls-certs\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.064272 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.064303 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-scripts\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.064362 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8x7z\" (UniqueName: \"kubernetes.io/projected/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-kube-api-access-j8x7z\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.064393 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-internal-tls-certs\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.064455 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-config-data\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.084610 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.112953 4912 scope.go:117] "RemoveContainer" containerID="2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"
Mar 18 13:30:17 crc kubenswrapper[4912]: E0318 13:30:17.113474 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1\": container with ID starting with 2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1 not found: ID does not exist" containerID="2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.113506 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"} err="failed to get container status \"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1\": rpc error: code = NotFound desc = could not find container \"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1\": container with ID starting with 2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.113532 4912 scope.go:117] "RemoveContainer" containerID="370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"
Mar 18 13:30:17 crc kubenswrapper[4912]: E0318 13:30:17.113806 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706\": container with ID starting with 370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706 not found: ID does not exist" containerID="370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.113829 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"} err="failed to get container status \"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706\": rpc error: code = NotFound desc = could not find container \"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706\": container with ID starting with 370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.113844 4912 scope.go:117] "RemoveContainer" containerID="b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"
Mar 18 13:30:17 crc kubenswrapper[4912]: E0318 13:30:17.114146 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8\": container with ID starting with b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8 not found: ID does not exist" containerID="b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114166 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"} err="failed to get container status \"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8\": rpc error: code = NotFound desc = could not find container \"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8\": container with ID starting with b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114180 4912 scope.go:117] "RemoveContainer" containerID="485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"
Mar 18 13:30:17 crc kubenswrapper[4912]: E0318 13:30:17.114409 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513\": container with ID starting with 485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513 not found: ID does not exist" containerID="485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114429 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"} err="failed to get container status \"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513\": rpc error: code = NotFound desc = could not find container \"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513\": container with ID starting with 485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114441 4912 scope.go:117] "RemoveContainer" containerID="2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114670 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1"} err="failed to get container status \"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1\": rpc error: code = NotFound desc = could not find container \"2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1\": container with ID starting with 2cf3b56ed7d8a1535c7043c912636747be8961c3f707e3447abce6321bc380e1 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114688 4912 scope.go:117] "RemoveContainer" containerID="370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114881 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706"} err="failed to get container status \"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706\": rpc error: code = NotFound desc = could not find container \"370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706\": container with ID starting with 370760d2a306b15dc4b233a03d4d0eba35649e8c37f21099c6655911bc377706 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.114899 4912 scope.go:117] "RemoveContainer" containerID="b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.115102 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8"} err="failed to get container status \"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8\": rpc error: code = NotFound desc = could not find container \"b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8\": container with ID starting with b9229ab56f03f6af6fa969dda4952b9b679dabd998f83521ed34942248ca34f8 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.115120 4912 scope.go:117] "RemoveContainer" containerID="485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.115294 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513"} err="failed to get container status \"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513\": rpc error: code = NotFound desc = could not find container \"485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513\": container with ID starting with 485e98392aa39ec9fb5c802486d7c39e7d37adbd8670e0740c30b509e7d33513 not found: ID does not exist"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.167116 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-scripts\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.167238 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8x7z\" (UniqueName: \"kubernetes.io/projected/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-kube-api-access-j8x7z\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.167275 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-internal-tls-certs\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.167293 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-config-data\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.167507 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-public-tls-certs\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.167534 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.172000 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-internal-tls-certs\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.172780 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.173065 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-scripts\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.174791 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-public-tls-certs\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.175242 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-config-data\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.192716 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8x7z\" (UniqueName: \"kubernetes.io/projected/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-kube-api-access-j8x7z\") pod \"aodh-0\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.357798 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.471598 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19ad23f9-f6b9-4ead-8326-29c9d0537d63","Type":"ContainerDied","Data":"f2a076bd6f7cb3166561c238610017d2f2d91d31f5f7623809bf4d0b1410acb7"}
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.471544 4912 generic.go:334] "Generic (PLEG): container finished" podID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerID="f2a076bd6f7cb3166561c238610017d2f2d91d31f5f7623809bf4d0b1410acb7" exitCode=0
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.892133 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.996568 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-nova-metadata-tls-certs\") pod \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") "
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.996776 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-config-data\") pod \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") "
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.996874 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-combined-ca-bundle\") pod \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") "
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.996908 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ad23f9-f6b9-4ead-8326-29c9d0537d63-logs\") pod \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") "
Mar 18 13:30:17 crc kubenswrapper[4912]: I0318 13:30:17.996927 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxqk8\" (UniqueName: \"kubernetes.io/projected/19ad23f9-f6b9-4ead-8326-29c9d0537d63-kube-api-access-dxqk8\") pod \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\" (UID: \"19ad23f9-f6b9-4ead-8326-29c9d0537d63\") "
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.001499 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ad23f9-f6b9-4ead-8326-29c9d0537d63-logs" (OuterVolumeSpecName: "logs") pod "19ad23f9-f6b9-4ead-8326-29c9d0537d63" (UID: "19ad23f9-f6b9-4ead-8326-29c9d0537d63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.006427 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ad23f9-f6b9-4ead-8326-29c9d0537d63-kube-api-access-dxqk8" (OuterVolumeSpecName: "kube-api-access-dxqk8") pod "19ad23f9-f6b9-4ead-8326-29c9d0537d63" (UID: "19ad23f9-f6b9-4ead-8326-29c9d0537d63"). InnerVolumeSpecName "kube-api-access-dxqk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.006593 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.046849 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19ad23f9-f6b9-4ead-8326-29c9d0537d63" (UID: "19ad23f9-f6b9-4ead-8326-29c9d0537d63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.052292 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-config-data" (OuterVolumeSpecName: "config-data") pod "19ad23f9-f6b9-4ead-8326-29c9d0537d63" (UID: "19ad23f9-f6b9-4ead-8326-29c9d0537d63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.101213 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "19ad23f9-f6b9-4ead-8326-29c9d0537d63" (UID: "19ad23f9-f6b9-4ead-8326-29c9d0537d63"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.102750 4912 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.102778 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.102792 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ad23f9-f6b9-4ead-8326-29c9d0537d63-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.102807 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ad23f9-f6b9-4ead-8326-29c9d0537d63-logs\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.102821 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxqk8\" (UniqueName: \"kubernetes.io/projected/19ad23f9-f6b9-4ead-8326-29c9d0537d63-kube-api-access-dxqk8\") on node \"crc\" DevicePath \"\""
Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.211816 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack/aodh-0"] Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.247829 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c7cbde-e4fa-42da-98c7-6e3b406326e3" path="/var/lib/kubelet/pods/00c7cbde-e4fa-42da-98c7-6e3b406326e3/volumes" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.249261 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91ce66a9-93da-4cc6-9ed8-2b9f96b330ed" path="/var/lib/kubelet/pods/91ce66a9-93da-4cc6-9ed8-2b9f96b330ed/volumes" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.503020 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f957cfd-c546-4a96-a235-ff2d1475ff7a","Type":"ContainerStarted","Data":"df9530b80c57c531291dbd21d2d61f9df71010d000ab9b93c90d25219a514913"} Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.503112 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f957cfd-c546-4a96-a235-ff2d1475ff7a","Type":"ContainerStarted","Data":"354c6fde167fd4234ae8bbd83b32c13254bbfc9374e046543ad45462d7f94f23"} Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.506418 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerStarted","Data":"62ed403ab2e2ddf594cdd9401c274876b3cc1e35e0ea0ae9bae61772e3d02b06"} Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.510901 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19ad23f9-f6b9-4ead-8326-29c9d0537d63","Type":"ContainerDied","Data":"827e8780f38aad5082c39610b04e91871d68072f9d522ca7f4b4dbc11d6292d7"} Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.510959 4912 scope.go:117] "RemoveContainer" containerID="f2a076bd6f7cb3166561c238610017d2f2d91d31f5f7623809bf4d0b1410acb7" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.511113 4912 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.531931 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.531904845 podStartE2EDuration="2.531904845s" podCreationTimestamp="2026-03-18 13:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:18.5224376 +0000 UTC m=+1666.981865035" watchObservedRunningTime="2026-03-18 13:30:18.531904845 +0000 UTC m=+1666.991332270" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.601942 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.603597 4912 scope.go:117] "RemoveContainer" containerID="fdf584e6131f430e893a76b345416cc3fc060f6768ab2ecb498d91aed7ce56be" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.652229 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.675294 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:30:18 crc kubenswrapper[4912]: E0318 13:30:18.676317 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-metadata" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.676350 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-metadata" Mar 18 13:30:18 crc kubenswrapper[4912]: E0318 13:30:18.676380 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-log" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.676390 4912 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-log" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.676730 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-metadata" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.676767 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" containerName="nova-metadata-log" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.679146 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.684320 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.688512 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.692330 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.732246 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-config-data\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.732332 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 
13:30:18.732452 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-logs\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.732529 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldznl\" (UniqueName: \"kubernetes.io/projected/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-kube-api-access-ldznl\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.732700 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.834874 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-logs\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.834978 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldznl\" (UniqueName: \"kubernetes.io/projected/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-kube-api-access-ldznl\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.835170 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.835261 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-config-data\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.835309 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.840571 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-logs\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.841572 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.845372 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 
13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.845703 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-config-data\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:18 crc kubenswrapper[4912]: I0318 13:30:18.863735 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldznl\" (UniqueName: \"kubernetes.io/projected/7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0-kube-api-access-ldznl\") pod \"nova-metadata-0\" (UID: \"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0\") " pod="openstack/nova-metadata-0" Mar 18 13:30:19 crc kubenswrapper[4912]: I0318 13:30:19.010496 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:30:19 crc kubenswrapper[4912]: I0318 13:30:19.543990 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerStarted","Data":"779ca3a11a650e6517547026d581476363ff0160365bb2d2c42a48d30d164d8c"} Mar 18 13:30:19 crc kubenswrapper[4912]: I0318 13:30:19.618514 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:30:20 crc kubenswrapper[4912]: I0318 13:30:20.255454 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ad23f9-f6b9-4ead-8326-29c9d0537d63" path="/var/lib/kubelet/pods/19ad23f9-f6b9-4ead-8326-29c9d0537d63/volumes" Mar 18 13:30:20 crc kubenswrapper[4912]: I0318 13:30:20.590932 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerStarted","Data":"0c731e46d6f4d6e2d17f95dc8b4ad1074f341a6f67781194e648ef8922c76edc"} Mar 18 13:30:20 crc kubenswrapper[4912]: I0318 13:30:20.608359 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0","Type":"ContainerStarted","Data":"a5dfe644bf31a9c99e3c1498ff9e448938160a347476bf313c8567781b0eaf05"} Mar 18 13:30:20 crc kubenswrapper[4912]: I0318 13:30:20.608423 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0","Type":"ContainerStarted","Data":"7f8043af2e25d8f0d3e240e72e0316c06a008c521b89bba44fb7179ce7ee5464"} Mar 18 13:30:20 crc kubenswrapper[4912]: I0318 13:30:20.608436 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0","Type":"ContainerStarted","Data":"440bf688dea787b230d80b38add80e1e39c846dca0f65bb9cdf1862e11a43582"} Mar 18 13:30:20 crc kubenswrapper[4912]: I0318 13:30:20.693877 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.693843675 podStartE2EDuration="2.693843675s" podCreationTimestamp="2026-03-18 13:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:20.659777999 +0000 UTC m=+1669.119205434" watchObservedRunningTime="2026-03-18 13:30:20.693843675 +0000 UTC m=+1669.153271100" Mar 18 13:30:21 crc kubenswrapper[4912]: I0318 13:30:21.451077 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:30:21 crc kubenswrapper[4912]: I0318 13:30:21.451833 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:30:21 crc kubenswrapper[4912]: I0318 13:30:21.623842 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerStarted","Data":"d92c978d0ab1bdaabec7ab01580be2f34b40cfbc7b1e94e08159cd9fbc124643"} Mar 18 13:30:22 crc kubenswrapper[4912]: 
I0318 13:30:22.086080 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.301093 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.384561 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-combined-ca-bundle\") pod \"bd5b8f89-623d-448d-9add-fba05ce4f710\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.384638 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5vq4\" (UniqueName: \"kubernetes.io/projected/bd5b8f89-623d-448d-9add-fba05ce4f710-kube-api-access-m5vq4\") pod \"bd5b8f89-623d-448d-9add-fba05ce4f710\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.384678 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs\") pod \"bd5b8f89-623d-448d-9add-fba05ce4f710\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.384999 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5b8f89-623d-448d-9add-fba05ce4f710-logs\") pod \"bd5b8f89-623d-448d-9add-fba05ce4f710\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.385377 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-internal-tls-certs\") pod 
\"bd5b8f89-623d-448d-9add-fba05ce4f710\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.385636 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-config-data\") pod \"bd5b8f89-623d-448d-9add-fba05ce4f710\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.385929 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5b8f89-623d-448d-9add-fba05ce4f710-logs" (OuterVolumeSpecName: "logs") pod "bd5b8f89-623d-448d-9add-fba05ce4f710" (UID: "bd5b8f89-623d-448d-9add-fba05ce4f710"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.387278 4912 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5b8f89-623d-448d-9add-fba05ce4f710-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.395251 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5b8f89-623d-448d-9add-fba05ce4f710-kube-api-access-m5vq4" (OuterVolumeSpecName: "kube-api-access-m5vq4") pod "bd5b8f89-623d-448d-9add-fba05ce4f710" (UID: "bd5b8f89-623d-448d-9add-fba05ce4f710"). InnerVolumeSpecName "kube-api-access-m5vq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.427672 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd5b8f89-623d-448d-9add-fba05ce4f710" (UID: "bd5b8f89-623d-448d-9add-fba05ce4f710"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.431122 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-config-data" (OuterVolumeSpecName: "config-data") pod "bd5b8f89-623d-448d-9add-fba05ce4f710" (UID: "bd5b8f89-623d-448d-9add-fba05ce4f710"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.477158 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd5b8f89-623d-448d-9add-fba05ce4f710" (UID: "bd5b8f89-623d-448d-9add-fba05ce4f710"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.489525 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd5b8f89-623d-448d-9add-fba05ce4f710" (UID: "bd5b8f89-623d-448d-9add-fba05ce4f710"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.489662 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs\") pod \"bd5b8f89-623d-448d-9add-fba05ce4f710\" (UID: \"bd5b8f89-623d-448d-9add-fba05ce4f710\") " Mar 18 13:30:22 crc kubenswrapper[4912]: W0318 13:30:22.489799 4912 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bd5b8f89-623d-448d-9add-fba05ce4f710/volumes/kubernetes.io~secret/public-tls-certs Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.489810 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd5b8f89-623d-448d-9add-fba05ce4f710" (UID: "bd5b8f89-623d-448d-9add-fba05ce4f710"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.490378 4912 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.490401 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.490410 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.490421 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5vq4\" (UniqueName: \"kubernetes.io/projected/bd5b8f89-623d-448d-9add-fba05ce4f710-kube-api-access-m5vq4\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.490431 4912 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5b8f89-623d-448d-9add-fba05ce4f710-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.646558 4912 generic.go:334] "Generic (PLEG): container finished" podID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerID="017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b" exitCode=0 Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.646681 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd5b8f89-623d-448d-9add-fba05ce4f710","Type":"ContainerDied","Data":"017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b"} Mar 18 13:30:22 crc kubenswrapper[4912]: 
I0318 13:30:22.646709 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.647110 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd5b8f89-623d-448d-9add-fba05ce4f710","Type":"ContainerDied","Data":"60c92c02e39382ee3e302d66ee6fb75825e9774191cbb2887f79f4d768f4332c"} Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.647155 4912 scope.go:117] "RemoveContainer" containerID="017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.654191 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerStarted","Data":"0b392ef47e5ad9bb3905e165cf3e5542e2b74e1b2d39bd3d7175b57d598f80c4"} Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.680540 4912 scope.go:117] "RemoveContainer" containerID="bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.690865 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.470638163 podStartE2EDuration="6.690844262s" podCreationTimestamp="2026-03-18 13:30:16 +0000 UTC" firstStartedPulling="2026-03-18 13:30:18.19654599 +0000 UTC m=+1666.655973415" lastFinishedPulling="2026-03-18 13:30:21.416752089 +0000 UTC m=+1669.876179514" observedRunningTime="2026-03-18 13:30:22.688942621 +0000 UTC m=+1671.148370076" watchObservedRunningTime="2026-03-18 13:30:22.690844262 +0000 UTC m=+1671.150271687" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.741598 4912 scope.go:117] "RemoveContainer" containerID="017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b" Mar 18 13:30:22 crc kubenswrapper[4912]: E0318 13:30:22.744141 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b\": container with ID starting with 017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b not found: ID does not exist" containerID="017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.744182 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b"} err="failed to get container status \"017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b\": rpc error: code = NotFound desc = could not find container \"017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b\": container with ID starting with 017b746c551020dc15c9364b84e7398036ffef698693fa2ccd86ab263b2d403b not found: ID does not exist" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.744238 4912 scope.go:117] "RemoveContainer" containerID="bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c" Mar 18 13:30:22 crc kubenswrapper[4912]: E0318 13:30:22.747815 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c\": container with ID starting with bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c not found: ID does not exist" containerID="bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.747897 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c"} err="failed to get container status \"bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c\": rpc error: code = NotFound desc = could not find container 
\"bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c\": container with ID starting with bd414e12bcf818c798aae18380ae51e8c19245dc26463aae16d3c33fa052066c not found: ID does not exist" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.762306 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.775557 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.791947 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:22 crc kubenswrapper[4912]: E0318 13:30:22.792784 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-api" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.792804 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-api" Mar 18 13:30:22 crc kubenswrapper[4912]: E0318 13:30:22.792824 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-log" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.792830 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-log" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.793139 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-api" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.793176 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" containerName="nova-api-log" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.794820 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.801899 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.802171 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.802298 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.823952 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.923596 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.923820 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-config-data\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.923991 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-public-tls-certs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.925052 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.925085 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scprm\" (UniqueName: \"kubernetes.io/projected/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-kube-api-access-scprm\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:22 crc kubenswrapper[4912]: I0318 13:30:22.925122 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-logs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.027259 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-logs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.027487 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.027525 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-config-data\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc 
kubenswrapper[4912]: I0318 13:30:23.027557 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-public-tls-certs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.027616 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.027635 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scprm\" (UniqueName: \"kubernetes.io/projected/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-kube-api-access-scprm\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.028483 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-logs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.035634 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.037710 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.040544 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-public-tls-certs\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.049217 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scprm\" (UniqueName: \"kubernetes.io/projected/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-kube-api-access-scprm\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.057797 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e5f76f-6db6-442d-9e3d-9b8a1de16910-config-data\") pod \"nova-api-0\" (UID: \"b1e5f76f-6db6-442d-9e3d-9b8a1de16910\") " pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.150850 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:30:23 crc kubenswrapper[4912]: I0318 13:30:23.677845 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:30:24 crc kubenswrapper[4912]: I0318 13:30:24.243430 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5b8f89-623d-448d-9add-fba05ce4f710" path="/var/lib/kubelet/pods/bd5b8f89-623d-448d-9add-fba05ce4f710/volumes" Mar 18 13:30:24 crc kubenswrapper[4912]: I0318 13:30:24.703466 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1e5f76f-6db6-442d-9e3d-9b8a1de16910","Type":"ContainerStarted","Data":"712d3dc3bb52e3ddb971b599529fd5b0ff7f426bb3e84d3e9f9b9ec7dbb2f516"} Mar 18 13:30:24 crc kubenswrapper[4912]: I0318 13:30:24.703525 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1e5f76f-6db6-442d-9e3d-9b8a1de16910","Type":"ContainerStarted","Data":"266339cd25a27174b5823249e2be4e96dcb65ea594c54eb89afc1681de2938a4"} Mar 18 13:30:24 crc kubenswrapper[4912]: I0318 13:30:24.703536 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1e5f76f-6db6-442d-9e3d-9b8a1de16910","Type":"ContainerStarted","Data":"b8f15700280ba42f423d35452d3d03a0eb7e86c4f9ec3c5f9797b036c6abd835"} Mar 18 13:30:24 crc kubenswrapper[4912]: I0318 13:30:24.756005 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7559785 podStartE2EDuration="2.7559785s" podCreationTimestamp="2026-03-18 13:30:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:24.743532365 +0000 UTC m=+1673.202959820" watchObservedRunningTime="2026-03-18 13:30:24.7559785 +0000 UTC m=+1673.215405925" Mar 18 13:30:27 crc kubenswrapper[4912]: I0318 13:30:27.085708 4912 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 13:30:27 crc kubenswrapper[4912]: I0318 13:30:27.119322 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 13:30:27 crc kubenswrapper[4912]: I0318 13:30:27.816679 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 13:30:29 crc kubenswrapper[4912]: I0318 13:30:29.011487 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:30:29 crc kubenswrapper[4912]: I0318 13:30:29.012000 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:30:30 crc kubenswrapper[4912]: I0318 13:30:30.030282 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.15:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:30:30 crc kubenswrapper[4912]: I0318 13:30:30.030867 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.15:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:30:33 crc kubenswrapper[4912]: I0318 13:30:33.153109 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:30:33 crc kubenswrapper[4912]: I0318 13:30:33.153911 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:30:34 crc kubenswrapper[4912]: I0318 13:30:34.175365 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="b1e5f76f-6db6-442d-9e3d-9b8a1de16910" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:30:34 crc kubenswrapper[4912]: I0318 13:30:34.175377 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b1e5f76f-6db6-442d-9e3d-9b8a1de16910" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 13:30:36 crc kubenswrapper[4912]: I0318 13:30:36.547241 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 13:30:37 crc kubenswrapper[4912]: I0318 13:30:37.010925 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:30:37 crc kubenswrapper[4912]: I0318 13:30:37.011020 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:30:39 crc kubenswrapper[4912]: I0318 13:30:39.017337 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:30:39 crc kubenswrapper[4912]: I0318 13:30:39.022791 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:30:39 crc kubenswrapper[4912]: I0318 13:30:39.026748 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:30:39 crc kubenswrapper[4912]: I0318 13:30:39.187243 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:30:41 crc kubenswrapper[4912]: I0318 13:30:41.003962 4912 scope.go:117] "RemoveContainer" containerID="eb995ecb618883b37121108c37fef475a9529000104dd82d0d3c2796ef4890ae" Mar 18 13:30:41 crc kubenswrapper[4912]: I0318 13:30:41.152287 4912 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:30:41 crc kubenswrapper[4912]: I0318 13:30:41.153448 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:30:41 crc kubenswrapper[4912]: I0318 13:30:41.642616 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:30:41 crc kubenswrapper[4912]: I0318 13:30:41.645540 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="db6818da-f93a-45ed-8eb3-2fcd8ddaefd5" containerName="kube-state-metrics" containerID="cri-o://999547ebf1221aff240a1712baa33835560f7b3afdd29efdf6775cf0deedfdde" gracePeriod=30 Mar 18 13:30:41 crc kubenswrapper[4912]: I0318 13:30:41.750863 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:30:41 crc kubenswrapper[4912]: I0318 13:30:41.751233 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" containerName="mysqld-exporter" containerID="cri-o://b9f591b5c7fa3912020ec3274b317aff7311520a8700480dbb8179b9d2e81c48" gracePeriod=30 Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.246812 4912 generic.go:334] "Generic (PLEG): container finished" podID="db6818da-f93a-45ed-8eb3-2fcd8ddaefd5" containerID="999547ebf1221aff240a1712baa33835560f7b3afdd29efdf6775cf0deedfdde" exitCode=2 Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.252908 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5","Type":"ContainerDied","Data":"999547ebf1221aff240a1712baa33835560f7b3afdd29efdf6775cf0deedfdde"} Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.253535 4912 generic.go:334] "Generic (PLEG): container finished" podID="67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" 
containerID="b9f591b5c7fa3912020ec3274b317aff7311520a8700480dbb8179b9d2e81c48" exitCode=2 Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.253591 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02","Type":"ContainerDied","Data":"b9f591b5c7fa3912020ec3274b317aff7311520a8700480dbb8179b9d2e81c48"} Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.433633 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.538668 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b544\" (UniqueName: \"kubernetes.io/projected/db6818da-f93a-45ed-8eb3-2fcd8ddaefd5-kube-api-access-8b544\") pod \"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5\" (UID: \"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5\") " Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.547605 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6818da-f93a-45ed-8eb3-2fcd8ddaefd5-kube-api-access-8b544" (OuterVolumeSpecName: "kube-api-access-8b544") pod "db6818da-f93a-45ed-8eb3-2fcd8ddaefd5" (UID: "db6818da-f93a-45ed-8eb3-2fcd8ddaefd5"). InnerVolumeSpecName "kube-api-access-8b544". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.644055 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b544\" (UniqueName: \"kubernetes.io/projected/db6818da-f93a-45ed-8eb3-2fcd8ddaefd5-kube-api-access-8b544\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.659768 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.746392 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-combined-ca-bundle\") pod \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.746660 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-config-data\") pod \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.746703 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs8dd\" (UniqueName: \"kubernetes.io/projected/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-kube-api-access-rs8dd\") pod \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\" (UID: \"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02\") " Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.757482 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-kube-api-access-rs8dd" (OuterVolumeSpecName: "kube-api-access-rs8dd") pod "67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" (UID: "67d2d27c-ebb6-4e31-ad64-1fc4e5bace02"). InnerVolumeSpecName "kube-api-access-rs8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.794371 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" (UID: "67d2d27c-ebb6-4e31-ad64-1fc4e5bace02"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.830181 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-config-data" (OuterVolumeSpecName: "config-data") pod "67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" (UID: "67d2d27c-ebb6-4e31-ad64-1fc4e5bace02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.853939 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.853984 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:42 crc kubenswrapper[4912]: I0318 13:30:42.853996 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs8dd\" (UniqueName: \"kubernetes.io/projected/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02-kube-api-access-rs8dd\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.171700 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.186683 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.199502 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.270417 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"db6818da-f93a-45ed-8eb3-2fcd8ddaefd5","Type":"ContainerDied","Data":"5f0835c7937622b13d93069ff7ad874f11e64c220f5ecb71ce17fa4a8f68cdcb"} Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.270495 4912 scope.go:117] "RemoveContainer" containerID="999547ebf1221aff240a1712baa33835560f7b3afdd29efdf6775cf0deedfdde" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.270724 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.275966 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"67d2d27c-ebb6-4e31-ad64-1fc4e5bace02","Type":"ContainerDied","Data":"d1a7d1f86a09f6df82db53112361085de1314c0694a5310e08cb8027de7644fa"} Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.276031 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.298911 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.402224 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.421208 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.423689 4912 scope.go:117] "RemoveContainer" containerID="b9f591b5c7fa3912020ec3274b317aff7311520a8700480dbb8179b9d2e81c48" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.441569 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.472665 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: E0318 
13:30:43.473586 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" containerName="mysqld-exporter" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.473606 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" containerName="mysqld-exporter" Mar 18 13:30:43 crc kubenswrapper[4912]: E0318 13:30:43.473627 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6818da-f93a-45ed-8eb3-2fcd8ddaefd5" containerName="kube-state-metrics" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.473634 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6818da-f93a-45ed-8eb3-2fcd8ddaefd5" containerName="kube-state-metrics" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.473877 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" containerName="mysqld-exporter" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.473902 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6818da-f93a-45ed-8eb3-2fcd8ddaefd5" containerName="kube-state-metrics" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.475122 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.487583 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.487879 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.496573 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.520568 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.522635 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.534438 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.534687 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.535197 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.549709 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.598623 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: 
I0318 13:30:43.598691 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8slj\" (UniqueName: \"kubernetes.io/projected/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-api-access-h8slj\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.598732 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.598779 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-config-data\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.599106 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.599331 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc 
kubenswrapper[4912]: I0318 13:30:43.599433 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qqcx\" (UniqueName: \"kubernetes.io/projected/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-kube-api-access-9qqcx\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.599691 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.704557 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.704667 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-config-data\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.704775 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.704874 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.704923 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qqcx\" (UniqueName: \"kubernetes.io/projected/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-kube-api-access-9qqcx\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.705065 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.705269 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.705307 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8slj\" (UniqueName: \"kubernetes.io/projected/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-api-access-h8slj\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.712848 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-config-data\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.714636 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.716518 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.723594 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8slj\" (UniqueName: \"kubernetes.io/projected/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-api-access-h8slj\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.732826 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.735351 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-combined-ca-bundle\") pod \"mysqld-exporter-0\" 
(UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.757728 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qqcx\" (UniqueName: \"kubernetes.io/projected/c50a92bb-6367-46ff-8f61-2cc2418f9f6e-kube-api-access-9qqcx\") pod \"mysqld-exporter-0\" (UID: \"c50a92bb-6367-46ff-8f61-2cc2418f9f6e\") " pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.758515 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afb2214-0d5c-469e-8763-580ea6d84b7d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4afb2214-0d5c-469e-8763-580ea6d84b7d\") " pod="openstack/kube-state-metrics-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.843583 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 13:30:43 crc kubenswrapper[4912]: I0318 13:30:43.862356 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.250500 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d2d27c-ebb6-4e31-ad64-1fc4e5bace02" path="/var/lib/kubelet/pods/67d2d27c-ebb6-4e31-ad64-1fc4e5bace02/volumes" Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.252373 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db6818da-f93a-45ed-8eb3-2fcd8ddaefd5" path="/var/lib/kubelet/pods/db6818da-f93a-45ed-8eb3-2fcd8ddaefd5/volumes" Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.533968 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.548572 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.770318 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.770945 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-central-agent" containerID="cri-o://442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8" gracePeriod=30 Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.771576 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="proxy-httpd" containerID="cri-o://463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646" gracePeriod=30 Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.771814 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-notification-agent" 
containerID="cri-o://e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046" gracePeriod=30 Mar 18 13:30:44 crc kubenswrapper[4912]: I0318 13:30:44.771861 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="sg-core" containerID="cri-o://5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037" gracePeriod=30 Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.312431 4912 generic.go:334] "Generic (PLEG): container finished" podID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerID="463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646" exitCode=0 Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.313258 4912 generic.go:334] "Generic (PLEG): container finished" podID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerID="5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037" exitCode=2 Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.313269 4912 generic.go:334] "Generic (PLEG): container finished" podID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerID="442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8" exitCode=0 Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.312531 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerDied","Data":"463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646"} Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.313357 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerDied","Data":"5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037"} Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.313374 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerDied","Data":"442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8"} Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.314929 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"c50a92bb-6367-46ff-8f61-2cc2418f9f6e","Type":"ContainerStarted","Data":"7239c820e6118e2f0c7fa5f70a24045c1241207e33f37daca9976799d6fb920e"} Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.317122 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4afb2214-0d5c-469e-8763-580ea6d84b7d","Type":"ContainerStarted","Data":"037ef0387b34eed953911af99035a5d4dee2c820d0ffa917b81caa36a164b3b9"} Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.317162 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4afb2214-0d5c-469e-8763-580ea6d84b7d","Type":"ContainerStarted","Data":"33531a6fb85aeda695da8fe8fc0c57407b295ef416044a12b0aa0e61b19dcdcf"} Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.317290 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 13:30:45 crc kubenswrapper[4912]: I0318 13:30:45.353062 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.921310478 podStartE2EDuration="2.353010133s" podCreationTimestamp="2026-03-18 13:30:43 +0000 UTC" firstStartedPulling="2026-03-18 13:30:44.551711952 +0000 UTC m=+1693.011139387" lastFinishedPulling="2026-03-18 13:30:44.983411617 +0000 UTC m=+1693.442839042" observedRunningTime="2026-03-18 13:30:45.333708194 +0000 UTC m=+1693.793135629" watchObservedRunningTime="2026-03-18 13:30:45.353010133 +0000 UTC m=+1693.812437558" Mar 18 13:30:46 crc kubenswrapper[4912]: I0318 13:30:46.356830 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/mysqld-exporter-0" event={"ID":"c50a92bb-6367-46ff-8f61-2cc2418f9f6e","Type":"ContainerStarted","Data":"38b841cf8002e06f479c039ecc6356817a60579027bbac24a586e3bd914ca85e"} Mar 18 13:30:46 crc kubenswrapper[4912]: I0318 13:30:46.431150 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.568959809 podStartE2EDuration="3.431122947s" podCreationTimestamp="2026-03-18 13:30:43 +0000 UTC" firstStartedPulling="2026-03-18 13:30:44.551349812 +0000 UTC m=+1693.010777227" lastFinishedPulling="2026-03-18 13:30:45.41351294 +0000 UTC m=+1693.872940365" observedRunningTime="2026-03-18 13:30:46.42417378 +0000 UTC m=+1694.883601205" watchObservedRunningTime="2026-03-18 13:30:46.431122947 +0000 UTC m=+1694.890550402" Mar 18 13:30:48 crc kubenswrapper[4912]: I0318 13:30:48.918791 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.006823 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-combined-ca-bundle\") pod \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.007047 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-run-httpd\") pod \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.007476 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" (UID: 
"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.008920 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-config-data\") pod \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.009372 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-scripts\") pod \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.009424 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-sg-core-conf-yaml\") pod \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.009454 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-log-httpd\") pod \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.009583 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdrqg\" (UniqueName: \"kubernetes.io/projected/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-kube-api-access-sdrqg\") pod \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\" (UID: \"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3\") " Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.010099 4912 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" (UID: "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.010717 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.010988 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.018026 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-scripts" (OuterVolumeSpecName: "scripts") pod "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" (UID: "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.048693 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-kube-api-access-sdrqg" (OuterVolumeSpecName: "kube-api-access-sdrqg") pod "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" (UID: "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3"). InnerVolumeSpecName "kube-api-access-sdrqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.089740 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" (UID: "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.113859 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.113895 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.113911 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdrqg\" (UniqueName: \"kubernetes.io/projected/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-kube-api-access-sdrqg\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.181819 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" (UID: "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.205035 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-config-data" (OuterVolumeSpecName: "config-data") pod "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" (UID: "ad86d4bc-a71a-4678-ae24-69c2a32ec4c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.221449 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.221506 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.397715 4912 generic.go:334] "Generic (PLEG): container finished" podID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerID="e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046" exitCode=0 Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.397789 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerDied","Data":"e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046"} Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.397833 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad86d4bc-a71a-4678-ae24-69c2a32ec4c3","Type":"ContainerDied","Data":"e9f9570454e3f7ae8b37e9c4d8cc8b848ed77262136fd2ccd5c5c662077c041a"} Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.397862 4912 scope.go:117] "RemoveContainer" 
containerID="463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.398145 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.505931 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.507809 4912 scope.go:117] "RemoveContainer" containerID="5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.518246 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.540788 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:49 crc kubenswrapper[4912]: E0318 13:30:49.541476 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-notification-agent" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541498 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-notification-agent" Mar 18 13:30:49 crc kubenswrapper[4912]: E0318 13:30:49.541540 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-central-agent" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541548 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-central-agent" Mar 18 13:30:49 crc kubenswrapper[4912]: E0318 13:30:49.541573 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="proxy-httpd" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541579 4912 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="proxy-httpd" Mar 18 13:30:49 crc kubenswrapper[4912]: E0318 13:30:49.541595 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="sg-core" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541601 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="sg-core" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541863 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-notification-agent" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541903 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="sg-core" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541913 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="proxy-httpd" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.541926 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" containerName="ceilometer-central-agent" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.544306 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.556384 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.556627 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.565750 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.584418 4912 scope.go:117] "RemoveContainer" containerID="e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.623886 4912 scope.go:117] "RemoveContainer" containerID="442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.632805 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.634770 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-log-httpd\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.634981 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.635159 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.635301 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.635390 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-config-data\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.635441 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-run-httpd\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.635510 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sllfr\" (UniqueName: \"kubernetes.io/projected/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-kube-api-access-sllfr\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.635563 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-scripts\") pod \"ceilometer-0\" (UID: 
\"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.653830 4912 scope.go:117] "RemoveContainer" containerID="463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646" Mar 18 13:30:49 crc kubenswrapper[4912]: E0318 13:30:49.654536 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646\": container with ID starting with 463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646 not found: ID does not exist" containerID="463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.654577 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646"} err="failed to get container status \"463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646\": rpc error: code = NotFound desc = could not find container \"463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646\": container with ID starting with 463a661a8c5cbc8d2891135dfab125c1e23b324d7cf18ec441a8f3075fd39646 not found: ID does not exist" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.654605 4912 scope.go:117] "RemoveContainer" containerID="5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037" Mar 18 13:30:49 crc kubenswrapper[4912]: E0318 13:30:49.655156 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037\": container with ID starting with 5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037 not found: ID does not exist" containerID="5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037" Mar 18 13:30:49 crc kubenswrapper[4912]: 
I0318 13:30:49.655197 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037"} err="failed to get container status \"5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037\": rpc error: code = NotFound desc = could not find container \"5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037\": container with ID starting with 5b496a063eb78110e504208effb32d64a9bd78ae621b6b35e647f1e260547037 not found: ID does not exist" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.655218 4912 scope.go:117] "RemoveContainer" containerID="e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046" Mar 18 13:30:49 crc kubenswrapper[4912]: E0318 13:30:49.655815 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046\": container with ID starting with e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046 not found: ID does not exist" containerID="e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.655887 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046"} err="failed to get container status \"e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046\": rpc error: code = NotFound desc = could not find container \"e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046\": container with ID starting with e08be12d2b47dd004227efb24581124326f02c7980a543218e677951e1a19046 not found: ID does not exist" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.655925 4912 scope.go:117] "RemoveContainer" containerID="442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8" Mar 18 13:30:49 crc 
kubenswrapper[4912]: E0318 13:30:49.656340 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8\": container with ID starting with 442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8 not found: ID does not exist" containerID="442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.656372 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8"} err="failed to get container status \"442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8\": rpc error: code = NotFound desc = could not find container \"442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8\": container with ID starting with 442ba096fb303180e7ea302edd711c466cd0b5164adc45c685e10e31e20751c8 not found: ID does not exist" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.737994 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.738123 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-config-data\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.738168 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-run-httpd\") pod 
\"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.738206 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sllfr\" (UniqueName: \"kubernetes.io/projected/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-kube-api-access-sllfr\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.738246 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-scripts\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.738299 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-log-httpd\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.738417 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.738535 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.739068 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-run-httpd\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.739135 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-log-httpd\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.745681 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-scripts\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.746047 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.746915 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.747020 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: 
I0318 13:30:49.757430 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-config-data\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.764909 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sllfr\" (UniqueName: \"kubernetes.io/projected/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-kube-api-access-sllfr\") pod \"ceilometer-0\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " pod="openstack/ceilometer-0" Mar 18 13:30:49 crc kubenswrapper[4912]: I0318 13:30:49.877952 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:30:50 crc kubenswrapper[4912]: I0318 13:30:50.291529 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad86d4bc-a71a-4678-ae24-69c2a32ec4c3" path="/var/lib/kubelet/pods/ad86d4bc-a71a-4678-ae24-69c2a32ec4c3/volumes" Mar 18 13:30:50 crc kubenswrapper[4912]: I0318 13:30:50.497771 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:30:51 crc kubenswrapper[4912]: I0318 13:30:51.436695 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerStarted","Data":"6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87"} Mar 18 13:30:51 crc kubenswrapper[4912]: I0318 13:30:51.437278 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerStarted","Data":"b94cb257f2e6f604d0d3ec77625557d4bebf380f86b726295d6bfe4661722f2c"} Mar 18 13:30:52 crc kubenswrapper[4912]: I0318 13:30:52.452587 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerStarted","Data":"5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b"} Mar 18 13:30:53 crc kubenswrapper[4912]: I0318 13:30:53.468342 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerStarted","Data":"7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83"} Mar 18 13:30:53 crc kubenswrapper[4912]: I0318 13:30:53.888462 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 13:30:55 crc kubenswrapper[4912]: I0318 13:30:55.523582 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerStarted","Data":"7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818"} Mar 18 13:30:55 crc kubenswrapper[4912]: I0318 13:30:55.524552 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:30:55 crc kubenswrapper[4912]: I0318 13:30:55.560046 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.312524306 podStartE2EDuration="6.560020315s" podCreationTimestamp="2026-03-18 13:30:49 +0000 UTC" firstStartedPulling="2026-03-18 13:30:50.533274088 +0000 UTC m=+1698.992701513" lastFinishedPulling="2026-03-18 13:30:54.780770097 +0000 UTC m=+1703.240197522" observedRunningTime="2026-03-18 13:30:55.549991335 +0000 UTC m=+1704.009418770" watchObservedRunningTime="2026-03-18 13:30:55.560020315 +0000 UTC m=+1704.019447740" Mar 18 13:31:07 crc kubenswrapper[4912]: I0318 13:31:06.999355 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:31:07 crc kubenswrapper[4912]: I0318 13:31:06.999993 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:31:19 crc kubenswrapper[4912]: I0318 13:31:19.888594 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.514167 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-j4dhx"] Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.528595 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-j4dhx"] Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.612519 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-4chng"] Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.614967 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.657338 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-4chng"] Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.664572 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-combined-ca-bundle\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.664721 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-config-data\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.664775 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68685\" (UniqueName: \"kubernetes.io/projected/068bb242-8e37-448f-b647-ee255f9104b9-kube-api-access-68685\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.767317 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-combined-ca-bundle\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.767471 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-config-data\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.767530 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68685\" (UniqueName: \"kubernetes.io/projected/068bb242-8e37-448f-b647-ee255f9104b9-kube-api-access-68685\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.777610 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-combined-ca-bundle\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.782429 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-config-data\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.788951 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68685\" (UniqueName: \"kubernetes.io/projected/068bb242-8e37-448f-b647-ee255f9104b9-kube-api-access-68685\") pod \"heat-db-sync-4chng\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") " pod="openstack/heat-db-sync-4chng" Mar 18 13:31:32 crc kubenswrapper[4912]: I0318 13:31:32.957767 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4chng" Mar 18 13:31:33 crc kubenswrapper[4912]: I0318 13:31:33.538176 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-4chng"] Mar 18 13:31:33 crc kubenswrapper[4912]: I0318 13:31:33.546822 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:31:34 crc kubenswrapper[4912]: I0318 13:31:34.047808 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4chng" event={"ID":"068bb242-8e37-448f-b647-ee255f9104b9","Type":"ContainerStarted","Data":"06b336cf3287454972be65d86c0f55b0506b8d63d05f3bd8d26705f9a292a3f9"} Mar 18 13:31:34 crc kubenswrapper[4912]: I0318 13:31:34.104935 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:31:34 crc kubenswrapper[4912]: I0318 13:31:34.262711 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="539c384f-3502-4474-9c40-432909696dfb" path="/var/lib/kubelet/pods/539c384f-3502-4474-9c40-432909696dfb/volumes" Mar 18 13:31:35 crc kubenswrapper[4912]: I0318 13:31:35.331598 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:31:35 crc kubenswrapper[4912]: I0318 13:31:35.332485 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-central-agent" containerID="cri-o://6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87" gracePeriod=30 Mar 18 13:31:35 crc kubenswrapper[4912]: I0318 13:31:35.333288 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="proxy-httpd" containerID="cri-o://7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818" gracePeriod=30 Mar 18 13:31:35 crc kubenswrapper[4912]: I0318 13:31:35.333350 
4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="sg-core" containerID="cri-o://7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83" gracePeriod=30 Mar 18 13:31:35 crc kubenswrapper[4912]: I0318 13:31:35.333383 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-notification-agent" containerID="cri-o://5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b" gracePeriod=30 Mar 18 13:31:35 crc kubenswrapper[4912]: I0318 13:31:35.437069 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.110978 4912 generic.go:334] "Generic (PLEG): container finished" podID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerID="7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818" exitCode=0 Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.111029 4912 generic.go:334] "Generic (PLEG): container finished" podID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerID="7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83" exitCode=2 Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.111057 4912 generic.go:334] "Generic (PLEG): container finished" podID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerID="5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b" exitCode=0 Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.111082 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerDied","Data":"7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818"} Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.111113 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerDied","Data":"7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83"} Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.111123 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerDied","Data":"5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b"} Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.882691 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.949278 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-ceilometer-tls-certs\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.949359 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-sg-core-conf-yaml\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.949553 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-scripts\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.949740 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-run-httpd\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" 
(UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.949785 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-combined-ca-bundle\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.949956 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-log-httpd\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.949983 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-config-data\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.950013 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sllfr\" (UniqueName: \"kubernetes.io/projected/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-kube-api-access-sllfr\") pod \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\" (UID: \"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9\") " Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.953523 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.958719 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.998238 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-kube-api-access-sllfr" (OuterVolumeSpecName: "kube-api-access-sllfr") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "kube-api-access-sllfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.999477 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:31:36 crc kubenswrapper[4912]: I0318 13:31:36.999576 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.007261 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-scripts" (OuterVolumeSpecName: "scripts") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: 
"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.040951 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.078737 4912 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.079426 4912 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.079971 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sllfr\" (UniqueName: \"kubernetes.io/projected/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-kube-api-access-sllfr\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.080522 4912 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.080725 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.199558 4912 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-config-data" (OuterVolumeSpecName: "config-data") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.201176 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.202544 4912 generic.go:334] "Generic (PLEG): container finished" podID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerID="6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87" exitCode=0 Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.202684 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerDied","Data":"6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87"} Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.202785 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9","Type":"ContainerDied","Data":"b94cb257f2e6f604d0d3ec77625557d4bebf380f86b726295d6bfe4661722f2c"} Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.202719 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.202882 4912 scope.go:117] "RemoveContainer" containerID="7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.224542 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.271968 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" (UID: "6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.282329 4912 scope.go:117] "RemoveContainer" containerID="7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.305182 4912 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.305238 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.345577 4912 scope.go:117] "RemoveContainer" containerID="5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.410505 4912 scope.go:117] "RemoveContainer" containerID="6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.467978 4912 scope.go:117] "RemoveContainer" containerID="7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818" Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.469285 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818\": container with ID starting with 7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818 not found: ID does not exist" containerID="7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.469326 4912 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818"} err="failed to get container status \"7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818\": rpc error: code = NotFound desc = could not find container \"7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818\": container with ID starting with 7da9e590da54e7f385acc41543bab5933a8be29bbc9ddcff1b8ce31e61f4c818 not found: ID does not exist" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.469349 4912 scope.go:117] "RemoveContainer" containerID="7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83" Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.469671 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83\": container with ID starting with 7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83 not found: ID does not exist" containerID="7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.469694 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83"} err="failed to get container status \"7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83\": rpc error: code = NotFound desc = could not find container \"7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83\": container with ID starting with 7039f9dcd7ba5adc80f95c60b0fa3f33c1f9f240856a580c55e446c8dbb63f83 not found: ID does not exist" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.469707 4912 scope.go:117] "RemoveContainer" containerID="5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b" Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.470336 4912 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b\": container with ID starting with 5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b not found: ID does not exist" containerID="5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.470365 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b"} err="failed to get container status \"5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b\": rpc error: code = NotFound desc = could not find container \"5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b\": container with ID starting with 5704479885f0e7eae8cbdac8e2b0ecef3aa3b7b71ab02181f5654858fa398b6b not found: ID does not exist" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.470383 4912 scope.go:117] "RemoveContainer" containerID="6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87" Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.470693 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87\": container with ID starting with 6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87 not found: ID does not exist" containerID="6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.470749 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87"} err="failed to get container status \"6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87\": rpc error: code = NotFound desc = could not find container 
\"6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87\": container with ID starting with 6296e3e8e4816343745e474aa3be9dc67af4dd88b72b90ba651b758723b89d87 not found: ID does not exist" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.567951 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.593791 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.607889 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.608756 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-notification-agent" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.608786 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-notification-agent" Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.608811 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="sg-core" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.608821 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="sg-core" Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.608875 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="proxy-httpd" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.608882 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="proxy-httpd" Mar 18 13:31:37 crc kubenswrapper[4912]: E0318 13:31:37.608909 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-central-agent" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.608921 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-central-agent" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.609252 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-notification-agent" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.609281 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="proxy-httpd" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.609300 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="sg-core" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.609330 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" containerName="ceilometer-central-agent" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.612510 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.619853 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.623112 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.633816 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.649854 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.721399 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.721738 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.721810 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd2tg\" (UniqueName: \"kubernetes.io/projected/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-kube-api-access-qd2tg\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.721889 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-run-httpd\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.721949 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.722023 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-scripts\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.722169 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-config-data\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.722313 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-log-httpd\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824464 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-log-httpd\") pod 
\"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824529 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824589 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824628 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd2tg\" (UniqueName: \"kubernetes.io/projected/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-kube-api-access-qd2tg\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824676 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-run-httpd\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824707 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824746 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-scripts\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.824799 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-config-data\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.827991 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-run-httpd\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.828360 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-log-httpd\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.831777 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.832439 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 
crc kubenswrapper[4912]: I0318 13:31:37.843097 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.846711 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-scripts\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.851208 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd2tg\" (UniqueName: \"kubernetes.io/projected/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-kube-api-access-qd2tg\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.851381 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b-config-data\") pod \"ceilometer-0\" (UID: \"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b\") " pod="openstack/ceilometer-0" Mar 18 13:31:37 crc kubenswrapper[4912]: I0318 13:31:37.949297 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:31:38 crc kubenswrapper[4912]: I0318 13:31:38.255994 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9" path="/var/lib/kubelet/pods/6eb86e6d-7fa5-42a7-a3d8-2bb3b79169f9/volumes" Mar 18 13:31:38 crc kubenswrapper[4912]: I0318 13:31:38.638770 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:31:38 crc kubenswrapper[4912]: W0318 13:31:38.668311 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb35fc5a_3fa3_4bd4_91d8_f5a004576e3b.slice/crio-e538398b7afc07efea5028a7c4961df26255c41ef5dd455bec6acb465975ec4c WatchSource:0}: Error finding container e538398b7afc07efea5028a7c4961df26255c41ef5dd455bec6acb465975ec4c: Status 404 returned error can't find the container with id e538398b7afc07efea5028a7c4961df26255c41ef5dd455bec6acb465975ec4c Mar 18 13:31:39 crc kubenswrapper[4912]: I0318 13:31:39.253744 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b","Type":"ContainerStarted","Data":"e538398b7afc07efea5028a7c4961df26255c41ef5dd455bec6acb465975ec4c"} Mar 18 13:31:40 crc kubenswrapper[4912]: I0318 13:31:40.690837 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="rabbitmq" containerID="cri-o://b15d4fb8c42327da3fbd061f47dd161f5f210106735fd6e8f75cc8b69d650bee" gracePeriod=604794 Mar 18 13:31:41 crc kubenswrapper[4912]: I0318 13:31:41.329053 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="rabbitmq" containerID="cri-o://b13a18336ccf9bb2bab92af4c7b6dab70113fa9af0e67388845f718f1c98a197" gracePeriod=604795 Mar 
18 13:31:41 crc kubenswrapper[4912]: I0318 13:31:41.374230 4912 scope.go:117] "RemoveContainer" containerID="bc88e5aa51f6871a39dba30b400b276e46d991120d0f21d6375bae33fdf14431" Mar 18 13:31:41 crc kubenswrapper[4912]: I0318 13:31:41.434488 4912 scope.go:117] "RemoveContainer" containerID="e15750ff6eda1e7fdefa6f350be058703d85dff7c6dbea0054fd53ad7c049615" Mar 18 13:31:41 crc kubenswrapper[4912]: I0318 13:31:41.491830 4912 scope.go:117] "RemoveContainer" containerID="93dad7287a0f64e9f789b5e069319da6913ae99cbcb2dc29d179e8dca5505371" Mar 18 13:31:41 crc kubenswrapper[4912]: I0318 13:31:41.588108 4912 scope.go:117] "RemoveContainer" containerID="3e67950d5f8f5d50765891297796ce57b13665db958e646afebba87c1a7856a0" Mar 18 13:31:41 crc kubenswrapper[4912]: I0318 13:31:41.668749 4912 scope.go:117] "RemoveContainer" containerID="8ced65959a177d85fdc3b18853e63692148431992035e9476eb763c588e01c9b" Mar 18 13:31:41 crc kubenswrapper[4912]: I0318 13:31:41.719116 4912 scope.go:117] "RemoveContainer" containerID="fe4129bd358e1716fa75b4bc44fa8f135256f852b8387ebeb8d3fbc096d30410" Mar 18 13:31:44 crc kubenswrapper[4912]: I0318 13:31:44.791752 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: connect: connection refused" Mar 18 13:31:45 crc kubenswrapper[4912]: I0318 13:31:45.035311 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 18 13:31:47 crc kubenswrapper[4912]: I0318 13:31:47.447756 4912 generic.go:334] "Generic (PLEG): container finished" podID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerID="b15d4fb8c42327da3fbd061f47dd161f5f210106735fd6e8f75cc8b69d650bee" exitCode=0 Mar 18 13:31:47 crc kubenswrapper[4912]: 
I0318 13:31:47.447969 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f54097ed-90b5-4369-8304-8bcb3a7d1839","Type":"ContainerDied","Data":"b15d4fb8c42327da3fbd061f47dd161f5f210106735fd6e8f75cc8b69d650bee"} Mar 18 13:31:48 crc kubenswrapper[4912]: I0318 13:31:48.467763 4912 generic.go:334] "Generic (PLEG): container finished" podID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerID="b13a18336ccf9bb2bab92af4c7b6dab70113fa9af0e67388845f718f1c98a197" exitCode=0 Mar 18 13:31:48 crc kubenswrapper[4912]: I0318 13:31:48.467882 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1","Type":"ContainerDied","Data":"b13a18336ccf9bb2bab92af4c7b6dab70113fa9af0e67388845f718f1c98a197"} Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.514293 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f54097ed-90b5-4369-8304-8bcb3a7d1839","Type":"ContainerDied","Data":"c459d755036419f0da01ec84540face151649cffb8107c677ddd6d314f2c9481"} Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.515292 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c459d755036419f0da01ec84540face151649cffb8107c677ddd6d314f2c9481" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.649571 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.835265 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.835839 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-server-conf\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.835973 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54097ed-90b5-4369-8304-8bcb3a7d1839-pod-info\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.836181 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-erlang-cookie\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.836442 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-confd\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.836578 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54097ed-90b5-4369-8304-8bcb3a7d1839-erlang-cookie-secret\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.836727 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr6tb\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-kube-api-access-rr6tb\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.836857 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-plugins\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.837058 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-plugins-conf\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.837197 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-tls\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.837299 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-config-data\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") " Mar 18 
13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.843265 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.866237 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.887370 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-kube-api-access-rr6tb" (OuterVolumeSpecName: "kube-api-access-rr6tb") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "kube-api-access-rr6tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.888707 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.892917 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.903372 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54097ed-90b5-4369-8304-8bcb3a7d1839-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.925149 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f54097ed-90b5-4369-8304-8bcb3a7d1839-pod-info" (OuterVolumeSpecName: "pod-info") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.937782 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-9ftqr"] Mar 18 13:31:50 crc kubenswrapper[4912]: E0318 13:31:50.938508 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="rabbitmq" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.938528 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="rabbitmq" Mar 18 13:31:50 crc kubenswrapper[4912]: E0318 13:31:50.938544 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="setup-container" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.938551 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="setup-container" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.939795 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" containerName="rabbitmq" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.941848 4912 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54097ed-90b5-4369-8304-8bcb3a7d1839-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.941892 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.941905 4912 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54097ed-90b5-4369-8304-8bcb3a7d1839-erlang-cookie-secret\") on node \"crc\" 
DevicePath \"\"" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.941915 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr6tb\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-kube-api-access-rr6tb\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.941926 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.941935 4912 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.941944 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.942165 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.953521 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.976845 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-config-data" (OuterVolumeSpecName: "config-data") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:50 crc kubenswrapper[4912]: I0318 13:31:50.981746 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-9ftqr"] Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.045605 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734" (OuterVolumeSpecName: "persistence") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:31:51 crc kubenswrapper[4912]: E0318 13:31:51.051253 4912 reconciler_common.go:156] "operationExecutor.UnmountVolume failed (controllerAttachDetachEnabled true) for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") : UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/f54097ed-90b5-4369-8304-8bcb3a7d1839/volumes/kubernetes.io~csi/pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/f54097ed-90b5-4369-8304-8bcb3a7d1839/volumes/kubernetes.io~csi/pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734/vol_data.json]: open /var/lib/kubelet/pods/f54097ed-90b5-4369-8304-8bcb3a7d1839/volumes/kubernetes.io~csi/pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734/vol_data.json: no such file or directory" err="UnmountVolume.NewUnmounter failed for volume \"persistence\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"f54097ed-90b5-4369-8304-8bcb3a7d1839\" (UID: \"f54097ed-90b5-4369-8304-8bcb3a7d1839\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/f54097ed-90b5-4369-8304-8bcb3a7d1839/volumes/kubernetes.io~csi/pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/f54097ed-90b5-4369-8304-8bcb3a7d1839/volumes/kubernetes.io~csi/pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734/vol_data.json]: open /var/lib/kubelet/pods/f54097ed-90b5-4369-8304-8bcb3a7d1839/volumes/kubernetes.io~csi/pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734/vol_data.json: no such file or directory" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.051926 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.051955 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnnf7\" (UniqueName: \"kubernetes.io/projected/f3728348-e4ca-4584-b717-c8c7a506c8c6-kube-api-access-wnnf7\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.052012 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " 
pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.052074 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.052174 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-config\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.052305 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.052451 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.052653 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.052689 
4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") on node \"crc\" " Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.112899 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-server-conf" (OuterVolumeSpecName: "server-conf") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155000 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnnf7\" (UniqueName: \"kubernetes.io/projected/f3728348-e4ca-4584-b717-c8c7a506c8c6-kube-api-access-wnnf7\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155073 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155118 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155158 4912 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155215 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-config\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155253 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155337 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.155919 4912 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54097ed-90b5-4369-8304-8bcb3a7d1839-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.156922 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: 
\"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.158874 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.174437 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.179171 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.184496 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.185135 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-config\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc 
kubenswrapper[4912]: I0318 13:31:51.203760 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnnf7\" (UniqueName: \"kubernetes.io/projected/f3728348-e4ca-4584-b717-c8c7a506c8c6-kube-api-access-wnnf7\") pod \"dnsmasq-dns-5b75489c6f-9ftqr\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.234614 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.235109 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734") on node "crc" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.240320 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f54097ed-90b5-4369-8304-8bcb3a7d1839" (UID: "f54097ed-90b5-4369-8304-8bcb3a7d1839"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.258347 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54097ed-90b5-4369-8304-8bcb3a7d1839-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.258854 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.455866 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.532239 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.595431 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.612882 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.630238 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.635394 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.675751 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777031 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsxk4\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-kube-api-access-fsxk4\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777143 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777174 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777211 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777250 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-config-data\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777276 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777306 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777881 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-server-conf\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777948 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.777979 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-pod-info\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.778155 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881372 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881502 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsxk4\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-kube-api-access-fsxk4\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881557 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881586 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-tls\") pod 
\"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881616 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881643 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-config-data\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881668 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881694 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881819 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-server-conf\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: 
I0318 13:31:51.881840 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.881861 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-pod-info\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.882587 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.883435 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.883792 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-config-data\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.884420 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.885212 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-server-conf\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.888903 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.890960 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.891129 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5aa3e0c5b60af900e5d16575c4249b7a6e9c51067aa146eb784e89b8270a4c65/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.891476 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.892219 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-pod-info\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.899819 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsxk4\" (UniqueName: \"kubernetes.io/projected/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-kube-api-access-fsxk4\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.905018 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " 
pod="openstack/rabbitmq-server-2" Mar 18 13:31:51 crc kubenswrapper[4912]: I0318 13:31:51.970228 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10b367bc-3f5c-47e9-b4c4-4dd784609734\") pod \"rabbitmq-server-2\" (UID: \"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0\") " pod="openstack/rabbitmq-server-2" Mar 18 13:31:52 crc kubenswrapper[4912]: I0318 13:31:52.247427 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54097ed-90b5-4369-8304-8bcb3a7d1839" path="/var/lib/kubelet/pods/f54097ed-90b5-4369-8304-8bcb3a7d1839/volumes" Mar 18 13:31:52 crc kubenswrapper[4912]: I0318 13:31:52.267866 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.789548 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.960256 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-erlang-cookie\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.960425 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-pod-info\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.961712 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-tls\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.961756 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-plugins-conf\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.961827 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-erlang-cookie-secret\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.962070 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-config-data\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.962091 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-confd\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.962126 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-server-conf\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.962311 
4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-plugins\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.966151 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.966256 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5w99\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-kube-api-access-s5w99\") pod \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\" (UID: \"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1\") " Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.969396 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.969901 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-pod-info" (OuterVolumeSpecName: "pod-info") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.970010 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.972696 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-kube-api-access-s5w99" (OuterVolumeSpecName: "kube-api-access-s5w99") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "kube-api-access-s5w99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.973157 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.984196 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:56 crc kubenswrapper[4912]: I0318 13:31:56.986293 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.019205 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8" (OuterVolumeSpecName: "persistence") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "pvc-471b2c87-cae0-43f0-a800-be93522bcfd8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075761 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075825 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") on node \"crc\" " Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075874 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5w99\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-kube-api-access-s5w99\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075890 4912 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075900 4912 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075910 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075922 4912 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.075931 4912 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.088312 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-server-conf" (OuterVolumeSpecName: "server-conf") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.128299 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-config-data" (OuterVolumeSpecName: "config-data") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.158610 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.162625 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-471b2c87-cae0-43f0-a800-be93522bcfd8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8") on node "crc" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.178737 4912 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.178785 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.178799 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.234966 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" (UID: "e3064ccc-dad8-4d19-9c71-03b0e7d32bb1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.283155 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.640115 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3064ccc-dad8-4d19-9c71-03b0e7d32bb1","Type":"ContainerDied","Data":"729845b90d8dc95720368f0bd984e68203789c173fb352c3f3906c683bad49ed"} Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.640194 4912 scope.go:117] "RemoveContainer" containerID="b13a18336ccf9bb2bab92af4c7b6dab70113fa9af0e67388845f718f1c98a197" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.640213 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.795121 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.812747 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.872273 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:31:57 crc kubenswrapper[4912]: E0318 13:31:57.873386 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="setup-container" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.873787 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="setup-container" Mar 18 13:31:57 crc kubenswrapper[4912]: E0318 13:31:57.873883 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="rabbitmq" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.873952 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="rabbitmq" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.874333 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="rabbitmq" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.876051 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.895315 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.895595 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.896077 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.896430 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.896735 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-plsnj" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.896982 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.912796 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 13:31:57 crc kubenswrapper[4912]: I0318 13:31:57.956796 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019084 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019171 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019251 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019301 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019376 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019400 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf6sb\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-kube-api-access-cf6sb\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019453 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019528 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019563 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019607 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.019775 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.123269 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.123684 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.123822 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.123939 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.124281 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.124306 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.124541 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.125443 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.125581 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.125718 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.125906 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.126025 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf6sb\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-kube-api-access-cf6sb\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.126190 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.124568 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.127750 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.129966 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.135797 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.136025 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.138745 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.139596 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.159003 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf6sb\" (UniqueName: \"kubernetes.io/projected/7b0b7b32-0583-4813-b9fd-9697bf4e9d05-kube-api-access-cf6sb\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 
crc kubenswrapper[4912]: I0318 13:31:58.174058 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.174390 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e1887cd5a1a70a198c286e0ab70ed1f31939a231341948609d08078c9331ef7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: E0318 13:31:58.240186 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 18 13:31:58 crc kubenswrapper[4912]: E0318 13:31:58.240289 4912 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 18 13:31:58 crc kubenswrapper[4912]: E0318 13:31:58.240473 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68685,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-4chng_openstack(068bb242-8e37-448f-b647-ee255f9104b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
18 13:31:58 crc kubenswrapper[4912]: E0318 13:31:58.241773 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-4chng" podUID="068bb242-8e37-448f-b647-ee255f9104b9" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.254263 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-471b2c87-cae0-43f0-a800-be93522bcfd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"7b0b7b32-0583-4813-b9fd-9697bf4e9d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.256466 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" path="/var/lib/kubelet/pods/e3064ccc-dad8-4d19-9c71-03b0e7d32bb1/volumes" Mar 18 13:31:58 crc kubenswrapper[4912]: I0318 13:31:58.553754 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:58 crc kubenswrapper[4912]: E0318 13:31:58.656299 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-4chng" podUID="068bb242-8e37-448f-b647-ee255f9104b9" Mar 18 13:31:59 crc kubenswrapper[4912]: I0318 13:31:59.791988 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e3064ccc-dad8-4d19-9c71-03b0e7d32bb1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: i/o timeout" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.195324 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564012-ddbbq"] Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.199721 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-ddbbq" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.203654 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.204139 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.206846 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.253152 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-ddbbq"] Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.311264 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8l6r\" (UniqueName: \"kubernetes.io/projected/eca1374f-79dd-46c8-8486-def1a8f2b9ef-kube-api-access-r8l6r\") pod \"auto-csr-approver-29564012-ddbbq\" (UID: \"eca1374f-79dd-46c8-8486-def1a8f2b9ef\") " pod="openshift-infra/auto-csr-approver-29564012-ddbbq" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.416296 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8l6r\" (UniqueName: \"kubernetes.io/projected/eca1374f-79dd-46c8-8486-def1a8f2b9ef-kube-api-access-r8l6r\") pod \"auto-csr-approver-29564012-ddbbq\" (UID: \"eca1374f-79dd-46c8-8486-def1a8f2b9ef\") " pod="openshift-infra/auto-csr-approver-29564012-ddbbq" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.439199 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8l6r\" (UniqueName: \"kubernetes.io/projected/eca1374f-79dd-46c8-8486-def1a8f2b9ef-kube-api-access-r8l6r\") pod \"auto-csr-approver-29564012-ddbbq\" (UID: \"eca1374f-79dd-46c8-8486-def1a8f2b9ef\") " 
pod="openshift-infra/auto-csr-approver-29564012-ddbbq" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.546132 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-ddbbq" Mar 18 13:32:00 crc kubenswrapper[4912]: E0318 13:32:00.593421 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 18 13:32:00 crc kubenswrapper[4912]: E0318 13:32:00.593863 4912 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 18 13:32:00 crc kubenswrapper[4912]: E0318 13:32:00.594015 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55bh79h57h7fhcfh5d8h579h59h5c5h576h59h55hf5hbbh8dh7h5d5h5fch97hbchdch674hb4h5b6h8dh56h694hffh56fh5h58fh5b8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qd2tg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 13:32:00 crc kubenswrapper[4912]: I0318 13:32:00.594349 4912 scope.go:117] "RemoveContainer" containerID="89b9b0744f0defd6a923dad45fc73dbb7c753a1d60556a747054c707ca6490fd" Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.204814 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.309928 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.327250 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-ddbbq"] Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.342450 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-9ftqr"] Mar 18 13:32:01 crc kubenswrapper[4912]: W0318 13:32:01.343542 4912 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b0b7b32_0583_4813_b9fd_9697bf4e9d05.slice/crio-e0cac492c19385322c7dc49a0ae4557f2bdfdbf771946b0b417651496c58983d WatchSource:0}: Error finding container e0cac492c19385322c7dc49a0ae4557f2bdfdbf771946b0b417651496c58983d: Status 404 returned error can't find the container with id e0cac492c19385322c7dc49a0ae4557f2bdfdbf771946b0b417651496c58983d
Mar 18 13:32:01 crc kubenswrapper[4912]: W0318 13:32:01.352166 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3728348_e4ca_4584_b717_c8c7a506c8c6.slice/crio-d407b3b45d558387c7bb74e9f234efc100d28022e85b9e8aa79a4696c600719a WatchSource:0}: Error finding container d407b3b45d558387c7bb74e9f234efc100d28022e85b9e8aa79a4696c600719a: Status 404 returned error can't find the container with id d407b3b45d558387c7bb74e9f234efc100d28022e85b9e8aa79a4696c600719a
Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.707238 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0","Type":"ContainerStarted","Data":"5f2db37921b63fd2700c1a8f8f8ae4f0e77a1037384381845fb04070613cdf73"}
Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.710016 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-ddbbq" event={"ID":"eca1374f-79dd-46c8-8486-def1a8f2b9ef","Type":"ContainerStarted","Data":"a09b07747a5064747f425312c84d1b147da20b08961a63ae70388f4190205f9c"}
Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.719536 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" event={"ID":"f3728348-e4ca-4584-b717-c8c7a506c8c6","Type":"ContainerStarted","Data":"d407b3b45d558387c7bb74e9f234efc100d28022e85b9e8aa79a4696c600719a"}
Mar 18 13:32:01 crc kubenswrapper[4912]: I0318 13:32:01.720467 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7b0b7b32-0583-4813-b9fd-9697bf4e9d05","Type":"ContainerStarted","Data":"e0cac492c19385322c7dc49a0ae4557f2bdfdbf771946b0b417651496c58983d"}
Mar 18 13:32:02 crc kubenswrapper[4912]: I0318 13:32:02.737304 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b","Type":"ContainerStarted","Data":"192a4fe4d7d6b308de664663b94cdeb786c90d27ff5c1e6531870c5938e51786"}
Mar 18 13:32:02 crc kubenswrapper[4912]: I0318 13:32:02.737993 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b","Type":"ContainerStarted","Data":"85bebf19eadcbb7c1d429bea8ea863e81ee3552ea2cccffab03a2d547e83d613"}
Mar 18 13:32:02 crc kubenswrapper[4912]: I0318 13:32:02.743657 4912 generic.go:334] "Generic (PLEG): container finished" podID="f3728348-e4ca-4584-b717-c8c7a506c8c6" containerID="7086ca245df10630b0e0ea50364a00edb3e8124c5b6b4be2b43e47289ef658cc" exitCode=0
Mar 18 13:32:02 crc kubenswrapper[4912]: I0318 13:32:02.743715 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" event={"ID":"f3728348-e4ca-4584-b717-c8c7a506c8c6","Type":"ContainerDied","Data":"7086ca245df10630b0e0ea50364a00edb3e8124c5b6b4be2b43e47289ef658cc"}
Mar 18 13:32:04 crc kubenswrapper[4912]: E0318 13:32:04.790570 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b"
Mar 18 13:32:04 crc kubenswrapper[4912]: I0318 13:32:04.795850 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" event={"ID":"f3728348-e4ca-4584-b717-c8c7a506c8c6","Type":"ContainerStarted","Data":"f7f394c1de75cbbea306b8ac50934c7e3e0c860a8d50ade29ee18dc5ae82b316"}
Mar 18 13:32:04 crc kubenswrapper[4912]: I0318 13:32:04.795966 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr"
Mar 18 13:32:04 crc kubenswrapper[4912]: I0318 13:32:04.799140 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7b0b7b32-0583-4813-b9fd-9697bf4e9d05","Type":"ContainerStarted","Data":"830b838f3d0f8dea544bd4482f9b7c50c80722c238097cc37abe82374c790066"}
Mar 18 13:32:04 crc kubenswrapper[4912]: I0318 13:32:04.802136 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0","Type":"ContainerStarted","Data":"138b40e80d75cab54c3610ca9e61e4f98c098b246479fa737f8af03c998fbad5"}
Mar 18 13:32:04 crc kubenswrapper[4912]: I0318 13:32:04.806342 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-ddbbq" event={"ID":"eca1374f-79dd-46c8-8486-def1a8f2b9ef","Type":"ContainerStarted","Data":"419380edd8e17c20b3558a4a9dd04e037afedfeac12420ab2d0f189658a84897"}
Mar 18 13:32:04 crc kubenswrapper[4912]: I0318 13:32:04.831656 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" podStartSLOduration=14.831627427 podStartE2EDuration="14.831627427s" podCreationTimestamp="2026-03-18 13:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:04.824562908 +0000 UTC m=+1773.283990343" watchObservedRunningTime="2026-03-18 13:32:04.831627427 +0000 UTC m=+1773.291054852"
Mar 18 13:32:04 crc kubenswrapper[4912]: I0318 13:32:04.882635 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564012-ddbbq" podStartSLOduration=2.432076489 podStartE2EDuration="4.882605978s" podCreationTimestamp="2026-03-18 13:32:00 +0000 UTC" firstStartedPulling="2026-03-18 13:32:01.344494421 +0000 UTC m=+1769.803921846" lastFinishedPulling="2026-03-18 13:32:03.79502391 +0000 UTC m=+1772.254451335" observedRunningTime="2026-03-18 13:32:04.875179658 +0000 UTC m=+1773.334607083" watchObservedRunningTime="2026-03-18 13:32:04.882605978 +0000 UTC m=+1773.342033413"
Mar 18 13:32:05 crc kubenswrapper[4912]: I0318 13:32:05.820152 4912 generic.go:334] "Generic (PLEG): container finished" podID="eca1374f-79dd-46c8-8486-def1a8f2b9ef" containerID="419380edd8e17c20b3558a4a9dd04e037afedfeac12420ab2d0f189658a84897" exitCode=0
Mar 18 13:32:05 crc kubenswrapper[4912]: I0318 13:32:05.820253 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-ddbbq" event={"ID":"eca1374f-79dd-46c8-8486-def1a8f2b9ef","Type":"ContainerDied","Data":"419380edd8e17c20b3558a4a9dd04e037afedfeac12420ab2d0f189658a84897"}
Mar 18 13:32:05 crc kubenswrapper[4912]: I0318 13:32:05.823243 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b","Type":"ContainerStarted","Data":"7996c844cf564b10b9ce8b562bc6955b11cc5fed9a0b1942b5139a92ecf9d57d"}
Mar 18 13:32:05 crc kubenswrapper[4912]: E0318 13:32:05.826598 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b"
Mar 18 13:32:06 crc kubenswrapper[4912]: I0318 13:32:06.838329 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 13:32:06 crc kubenswrapper[4912]: E0318 13:32:06.839973 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b"
Mar 18 13:32:06 crc kubenswrapper[4912]: I0318 13:32:06.998379 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:32:06 crc kubenswrapper[4912]: I0318 13:32:06.998458 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:32:06 crc kubenswrapper[4912]: I0318 13:32:06.998517 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g"
Mar 18 13:32:06 crc kubenswrapper[4912]: I0318 13:32:06.999797 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 13:32:06 crc kubenswrapper[4912]: I0318 13:32:06.999852 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" gracePeriod=600
Mar 18 13:32:07 crc kubenswrapper[4912]: E0318 13:32:07.150676 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.341548 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-ddbbq"
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.509291 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8l6r\" (UniqueName: \"kubernetes.io/projected/eca1374f-79dd-46c8-8486-def1a8f2b9ef-kube-api-access-r8l6r\") pod \"eca1374f-79dd-46c8-8486-def1a8f2b9ef\" (UID: \"eca1374f-79dd-46c8-8486-def1a8f2b9ef\") "
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.519305 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca1374f-79dd-46c8-8486-def1a8f2b9ef-kube-api-access-r8l6r" (OuterVolumeSpecName: "kube-api-access-r8l6r") pod "eca1374f-79dd-46c8-8486-def1a8f2b9ef" (UID: "eca1374f-79dd-46c8-8486-def1a8f2b9ef"). InnerVolumeSpecName "kube-api-access-r8l6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.613580 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8l6r\" (UniqueName: \"kubernetes.io/projected/eca1374f-79dd-46c8-8486-def1a8f2b9ef-kube-api-access-r8l6r\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.855227 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-ddbbq"
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.856858 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-ddbbq" event={"ID":"eca1374f-79dd-46c8-8486-def1a8f2b9ef","Type":"ContainerDied","Data":"a09b07747a5064747f425312c84d1b147da20b08961a63ae70388f4190205f9c"}
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.857060 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09b07747a5064747f425312c84d1b147da20b08961a63ae70388f4190205f9c"
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.859906 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" exitCode=0
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.860572 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6"}
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.860721 4912 scope.go:117] "RemoveContainer" containerID="b9b062d12671b37c0b8053de923c0a7e885a374a0f869eb2369e0f8236cbd4db"
Mar 18 13:32:07 crc kubenswrapper[4912]: I0318 13:32:07.862151 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6"
Mar 18 13:32:07 crc kubenswrapper[4912]: E0318 13:32:07.862597 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:32:07 crc kubenswrapper[4912]: E0318 13:32:07.863504 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b"
Mar 18 13:32:08 crc kubenswrapper[4912]: I0318 13:32:08.432890 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-nt9g8"]
Mar 18 13:32:08 crc kubenswrapper[4912]: I0318 13:32:08.445943 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-nt9g8"]
Mar 18 13:32:10 crc kubenswrapper[4912]: I0318 13:32:10.249358 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="366d4bcf-ceeb-48e7-a834-c579166036b6" path="/var/lib/kubelet/pods/366d4bcf-ceeb-48e7-a834-c579166036b6/volumes"
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.458295 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr"
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.577189 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-92qkk"]
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.587641 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" podUID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerName="dnsmasq-dns" containerID="cri-o://0d651a2c69bf53833e90e04ae8ddb21cd68b4525cfcf7e3d5e48531edd7ab09b" gracePeriod=10
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.961555 4912 generic.go:334] "Generic (PLEG): container finished" podID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerID="0d651a2c69bf53833e90e04ae8ddb21cd68b4525cfcf7e3d5e48531edd7ab09b" exitCode=0
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.961924 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" event={"ID":"8bd4ffa4-2efe-40eb-98ab-231ef0283598","Type":"ContainerDied","Data":"0d651a2c69bf53833e90e04ae8ddb21cd68b4525cfcf7e3d5e48531edd7ab09b"}
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.966054 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-8qfhh"]
Mar 18 13:32:11 crc kubenswrapper[4912]: E0318 13:32:11.966732 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca1374f-79dd-46c8-8486-def1a8f2b9ef" containerName="oc"
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.966758 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca1374f-79dd-46c8-8486-def1a8f2b9ef" containerName="oc"
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.980397 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca1374f-79dd-46c8-8486-def1a8f2b9ef" containerName="oc"
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.986182 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4chng" event={"ID":"068bb242-8e37-448f-b647-ee255f9104b9","Type":"ContainerStarted","Data":"ad249e259e6551d81edf9dc8e6289145da4a2416361639f9e86448849c665ba3"}
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.986442 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:11 crc kubenswrapper[4912]: I0318 13:32:11.996018 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-8qfhh"]
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.052899 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-4chng" podStartSLOduration=2.181572764 podStartE2EDuration="40.052867581s" podCreationTimestamp="2026-03-18 13:31:32 +0000 UTC" firstStartedPulling="2026-03-18 13:31:33.546546921 +0000 UTC m=+1742.005974346" lastFinishedPulling="2026-03-18 13:32:11.417841748 +0000 UTC m=+1779.877269163" observedRunningTime="2026-03-18 13:32:12.014846488 +0000 UTC m=+1780.474273913" watchObservedRunningTime="2026-03-18 13:32:12.052867581 +0000 UTC m=+1780.512295026"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.067945 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcj7w\" (UniqueName: \"kubernetes.io/projected/0db2a85b-f256-4ef8-b380-5831240903c7-kube-api-access-xcj7w\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.068020 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.068432 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.068462 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-config\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.068570 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.068595 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.068676 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.172291 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcj7w\" (UniqueName: \"kubernetes.io/projected/0db2a85b-f256-4ef8-b380-5831240903c7-kube-api-access-xcj7w\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.172368 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.172422 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.172443 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-config\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.172520 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.172537 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.172617 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.173661 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.174300 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-config\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.174436 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.174718 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.174764 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.175218 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2a85b-f256-4ef8-b380-5831240903c7-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.201550 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcj7w\" (UniqueName: \"kubernetes.io/projected/0db2a85b-f256-4ef8-b380-5831240903c7-kube-api-access-xcj7w\") pod \"dnsmasq-dns-5d75f767dc-8qfhh\" (UID: \"0db2a85b-f256-4ef8-b380-5831240903c7\") " pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.253924 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.345185 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.377717 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9rzw\" (UniqueName: \"kubernetes.io/projected/8bd4ffa4-2efe-40eb-98ab-231ef0283598-kube-api-access-n9rzw\") pod \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") "
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.378382 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-config\") pod \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") "
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.378459 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-svc\") pod \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") "
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.378506 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-nb\") pod \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") "
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.378723 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-swift-storage-0\") pod \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") "
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.378829 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-sb\") pod \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\" (UID: \"8bd4ffa4-2efe-40eb-98ab-231ef0283598\") "
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.384789 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd4ffa4-2efe-40eb-98ab-231ef0283598-kube-api-access-n9rzw" (OuterVolumeSpecName: "kube-api-access-n9rzw") pod "8bd4ffa4-2efe-40eb-98ab-231ef0283598" (UID: "8bd4ffa4-2efe-40eb-98ab-231ef0283598"). InnerVolumeSpecName "kube-api-access-n9rzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.468271 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bd4ffa4-2efe-40eb-98ab-231ef0283598" (UID: "8bd4ffa4-2efe-40eb-98ab-231ef0283598"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.469262 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-config" (OuterVolumeSpecName: "config") pod "8bd4ffa4-2efe-40eb-98ab-231ef0283598" (UID: "8bd4ffa4-2efe-40eb-98ab-231ef0283598"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.482781 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bd4ffa4-2efe-40eb-98ab-231ef0283598" (UID: "8bd4ffa4-2efe-40eb-98ab-231ef0283598"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.488580 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.500210 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9rzw\" (UniqueName: \"kubernetes.io/projected/8bd4ffa4-2efe-40eb-98ab-231ef0283598-kube-api-access-n9rzw\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.500245 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-config\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.500261 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.493344 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bd4ffa4-2efe-40eb-98ab-231ef0283598" (UID: "8bd4ffa4-2efe-40eb-98ab-231ef0283598"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.533005 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bd4ffa4-2efe-40eb-98ab-231ef0283598" (UID: "8bd4ffa4-2efe-40eb-98ab-231ef0283598"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.605284 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.605851 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bd4ffa4-2efe-40eb-98ab-231ef0283598-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.986822 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk" event={"ID":"8bd4ffa4-2efe-40eb-98ab-231ef0283598","Type":"ContainerDied","Data":"c5bdba073f1b259a6dcc5d3078bcf0ecbe0ed45885da45ba4ba829507a570448"}
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.986904 4912 scope.go:117] "RemoveContainer" containerID="0d651a2c69bf53833e90e04ae8ddb21cd68b4525cfcf7e3d5e48531edd7ab09b"
Mar 18 13:32:12 crc kubenswrapper[4912]: I0318 13:32:12.986927 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-92qkk"
Mar 18 13:32:13 crc kubenswrapper[4912]: I0318 13:32:13.019181 4912 scope.go:117] "RemoveContainer" containerID="c44e2c6a426590aab32be8162631d5f44bce924c5728f58a465190cb4f5717ce"
Mar 18 13:32:13 crc kubenswrapper[4912]: I0318 13:32:13.048708 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-92qkk"]
Mar 18 13:32:13 crc kubenswrapper[4912]: I0318 13:32:13.064706 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-8qfhh"]
Mar 18 13:32:13 crc kubenswrapper[4912]: I0318 13:32:13.082460 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-92qkk"]
Mar 18 13:32:14 crc kubenswrapper[4912]: I0318 13:32:14.005211 4912 generic.go:334] "Generic (PLEG): container finished" podID="0db2a85b-f256-4ef8-b380-5831240903c7" containerID="c96512991e8326a2de9d7125b0b566dc6dacfbd68830e6d1a5f5e59df964aedc" exitCode=0
Mar 18 13:32:14 crc kubenswrapper[4912]: I0318 13:32:14.007143 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh" event={"ID":"0db2a85b-f256-4ef8-b380-5831240903c7","Type":"ContainerDied","Data":"c96512991e8326a2de9d7125b0b566dc6dacfbd68830e6d1a5f5e59df964aedc"}
Mar 18 13:32:14 crc kubenswrapper[4912]: I0318 13:32:14.007178 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh" event={"ID":"0db2a85b-f256-4ef8-b380-5831240903c7","Type":"ContainerStarted","Data":"99496770ff74e52e3310f88d41727fb1d625d3e03ffa49e116136eb0a34ccb31"}
Mar 18 13:32:14 crc kubenswrapper[4912]: I0318 13:32:14.246850 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" path="/var/lib/kubelet/pods/8bd4ffa4-2efe-40eb-98ab-231ef0283598/volumes"
Mar 18 13:32:15 crc kubenswrapper[4912]: I0318 13:32:15.030180 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh" event={"ID":"0db2a85b-f256-4ef8-b380-5831240903c7","Type":"ContainerStarted","Data":"021bbb99a1c76e25cfb1ac16c12954e3935c4058dd6de6c6cc3b8c434f087896"}
Mar 18 13:32:15 crc kubenswrapper[4912]: I0318 13:32:15.030984 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh"
Mar 18 13:32:15 crc kubenswrapper[4912]: I0318 13:32:15.072989 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh" podStartSLOduration=4.072964242 podStartE2EDuration="4.072964242s" podCreationTimestamp="2026-03-18 13:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:15.053742155 +0000 UTC m=+1783.513169600" watchObservedRunningTime="2026-03-18 13:32:15.072964242 +0000 UTC m=+1783.532391667"
Mar 18 13:32:16 crc kubenswrapper[4912]: I0318 13:32:16.047604 4912 generic.go:334] "Generic (PLEG): container finished" podID="068bb242-8e37-448f-b647-ee255f9104b9" containerID="ad249e259e6551d81edf9dc8e6289145da4a2416361639f9e86448849c665ba3" exitCode=0
Mar 18 13:32:16 crc kubenswrapper[4912]: I0318 13:32:16.047722 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4chng" event={"ID":"068bb242-8e37-448f-b647-ee255f9104b9","Type":"ContainerDied","Data":"ad249e259e6551d81edf9dc8e6289145da4a2416361639f9e86448849c665ba3"}
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.569502 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-4chng"
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.693451 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68685\" (UniqueName: \"kubernetes.io/projected/068bb242-8e37-448f-b647-ee255f9104b9-kube-api-access-68685\") pod \"068bb242-8e37-448f-b647-ee255f9104b9\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") "
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.693689 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-config-data\") pod \"068bb242-8e37-448f-b647-ee255f9104b9\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") "
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.693766 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-combined-ca-bundle\") pod \"068bb242-8e37-448f-b647-ee255f9104b9\" (UID: \"068bb242-8e37-448f-b647-ee255f9104b9\") "
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.716324 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068bb242-8e37-448f-b647-ee255f9104b9-kube-api-access-68685" (OuterVolumeSpecName: "kube-api-access-68685") pod "068bb242-8e37-448f-b647-ee255f9104b9" (UID: "068bb242-8e37-448f-b647-ee255f9104b9"). InnerVolumeSpecName "kube-api-access-68685". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.801207 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68685\" (UniqueName: \"kubernetes.io/projected/068bb242-8e37-448f-b647-ee255f9104b9-kube-api-access-68685\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.865356 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "068bb242-8e37-448f-b647-ee255f9104b9" (UID: "068bb242-8e37-448f-b647-ee255f9104b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.916169 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:17 crc kubenswrapper[4912]: I0318 13:32:17.966234 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-config-data" (OuterVolumeSpecName: "config-data") pod "068bb242-8e37-448f-b647-ee255f9104b9" (UID: "068bb242-8e37-448f-b647-ee255f9104b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:18 crc kubenswrapper[4912]: I0318 13:32:18.019659 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/068bb242-8e37-448f-b647-ee255f9104b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:18 crc kubenswrapper[4912]: I0318 13:32:18.078139 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4chng" event={"ID":"068bb242-8e37-448f-b647-ee255f9104b9","Type":"ContainerDied","Data":"06b336cf3287454972be65d86c0f55b0506b8d63d05f3bd8d26705f9a292a3f9"} Mar 18 13:32:18 crc kubenswrapper[4912]: I0318 13:32:18.078196 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b336cf3287454972be65d86c0f55b0506b8d63d05f3bd8d26705f9a292a3f9" Mar 18 13:32:18 crc kubenswrapper[4912]: I0318 13:32:18.078301 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-4chng" Mar 18 13:32:18 crc kubenswrapper[4912]: I0318 13:32:18.228344 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:32:18 crc kubenswrapper[4912]: E0318 13:32:18.229100 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.300625 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-84cf4c78d4-lld2l"] Mar 18 13:32:19 crc kubenswrapper[4912]: E0318 13:32:19.301743 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="068bb242-8e37-448f-b647-ee255f9104b9" containerName="heat-db-sync" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.301763 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="068bb242-8e37-448f-b647-ee255f9104b9" containerName="heat-db-sync" Mar 18 13:32:19 crc kubenswrapper[4912]: E0318 13:32:19.301784 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerName="dnsmasq-dns" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.301796 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerName="dnsmasq-dns" Mar 18 13:32:19 crc kubenswrapper[4912]: E0318 13:32:19.301831 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerName="init" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.301839 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerName="init" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.302193 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd4ffa4-2efe-40eb-98ab-231ef0283598" containerName="dnsmasq-dns" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.302229 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="068bb242-8e37-448f-b647-ee255f9104b9" containerName="heat-db-sync" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.303373 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.323742 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-84cf4c78d4-lld2l"] Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.362869 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-config-data\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.363122 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zxq5\" (UniqueName: \"kubernetes.io/projected/4569488e-f272-42ff-a887-5eb7d399cece-kube-api-access-2zxq5\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.363433 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-combined-ca-bundle\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.363563 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-config-data-custom\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.435361 4912 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-api-6cb8998785-grlzs"] Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.437280 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.457698 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6cb8998785-grlzs"] Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.469783 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-config-data\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.469874 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zxq5\" (UniqueName: \"kubernetes.io/projected/4569488e-f272-42ff-a887-5eb7d399cece-kube-api-access-2zxq5\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.469972 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-combined-ca-bundle\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.470016 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-config-data-custom\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: 
I0318 13:32:19.482904 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-config-data\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.483419 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5d44c98778-8qzdk"] Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.484998 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-config-data-custom\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.485641 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.495789 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4569488e-f272-42ff-a887-5eb7d399cece-combined-ca-bundle\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.500247 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5d44c98778-8qzdk"] Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.513938 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zxq5\" (UniqueName: \"kubernetes.io/projected/4569488e-f272-42ff-a887-5eb7d399cece-kube-api-access-2zxq5\") pod \"heat-engine-84cf4c78d4-lld2l\" (UID: \"4569488e-f272-42ff-a887-5eb7d399cece\") " 
pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.574613 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-internal-tls-certs\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.575130 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-config-data-custom\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.575231 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n62w\" (UniqueName: \"kubernetes.io/projected/47d6eba1-db73-4357-86a6-c15e562f20bd-kube-api-access-6n62w\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.575343 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-public-tls-certs\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.575582 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-config-data\") pod \"heat-api-6cb8998785-grlzs\" (UID: 
\"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.575668 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-combined-ca-bundle\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.671285 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678194 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-internal-tls-certs\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678300 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-config-data-custom\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678356 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n62w\" (UniqueName: \"kubernetes.io/projected/47d6eba1-db73-4357-86a6-c15e562f20bd-kube-api-access-6n62w\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678392 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-config-data-custom\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678430 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-public-tls-certs\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678639 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-config-data\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678680 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvpw9\" (UniqueName: \"kubernetes.io/projected/1063826f-dbd1-4b72-b541-dd9832dd788c-kube-api-access-gvpw9\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678731 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-config-data\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678768 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-combined-ca-bundle\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678802 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-internal-tls-certs\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678942 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-public-tls-certs\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.678982 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-combined-ca-bundle\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.685015 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-config-data-custom\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.685995 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-config-data\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.686625 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-public-tls-certs\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.690861 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-internal-tls-certs\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.701317 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d6eba1-db73-4357-86a6-c15e562f20bd-combined-ca-bundle\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.712170 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n62w\" (UniqueName: \"kubernetes.io/projected/47d6eba1-db73-4357-86a6-c15e562f20bd-kube-api-access-6n62w\") pod \"heat-api-6cb8998785-grlzs\" (UID: \"47d6eba1-db73-4357-86a6-c15e562f20bd\") " pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.770320 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.780964 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-public-tls-certs\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.781304 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-combined-ca-bundle\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.781515 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-config-data-custom\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.781792 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-config-data\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.781916 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvpw9\" (UniqueName: \"kubernetes.io/projected/1063826f-dbd1-4b72-b541-dd9832dd788c-kube-api-access-gvpw9\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " 
pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.782070 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-internal-tls-certs\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.789518 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-public-tls-certs\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.789756 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-combined-ca-bundle\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.791405 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-config-data\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.793491 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-internal-tls-certs\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc 
kubenswrapper[4912]: I0318 13:32:19.804124 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvpw9\" (UniqueName: \"kubernetes.io/projected/1063826f-dbd1-4b72-b541-dd9832dd788c-kube-api-access-gvpw9\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.807469 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063826f-dbd1-4b72-b541-dd9832dd788c-config-data-custom\") pod \"heat-cfnapi-5d44c98778-8qzdk\" (UID: \"1063826f-dbd1-4b72-b541-dd9832dd788c\") " pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:19 crc kubenswrapper[4912]: I0318 13:32:19.915028 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:20 crc kubenswrapper[4912]: I0318 13:32:20.281732 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 13:32:20 crc kubenswrapper[4912]: I0318 13:32:20.323176 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-84cf4c78d4-lld2l"] Mar 18 13:32:20 crc kubenswrapper[4912]: I0318 13:32:20.445306 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6cb8998785-grlzs"] Mar 18 13:32:20 crc kubenswrapper[4912]: I0318 13:32:20.611814 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5d44c98778-8qzdk"] Mar 18 13:32:20 crc kubenswrapper[4912]: W0318 13:32:20.625214 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1063826f_dbd1_4b72_b541_dd9832dd788c.slice/crio-1b9a6a9831eb187ee6daa3e36fa847da58f958828e19d487bb986c9269d807a2 WatchSource:0}: Error finding container 
1b9a6a9831eb187ee6daa3e36fa847da58f958828e19d487bb986c9269d807a2: Status 404 returned error can't find the container with id 1b9a6a9831eb187ee6daa3e36fa847da58f958828e19d487bb986c9269d807a2 Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.130708 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5d44c98778-8qzdk" event={"ID":"1063826f-dbd1-4b72-b541-dd9832dd788c","Type":"ContainerStarted","Data":"1b9a6a9831eb187ee6daa3e36fa847da58f958828e19d487bb986c9269d807a2"} Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.135440 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b","Type":"ContainerStarted","Data":"1e366ffa0c569c413306141c62fa70428165be06b91a7f09bb4263f76a862376"} Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.138439 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-84cf4c78d4-lld2l" event={"ID":"4569488e-f272-42ff-a887-5eb7d399cece","Type":"ContainerStarted","Data":"b0a8cba303e4c4fa6f87ad4e8bb0e1b1cc5887e80a58530809d8b5e22efb6967"} Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.138631 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-84cf4c78d4-lld2l" event={"ID":"4569488e-f272-42ff-a887-5eb7d399cece","Type":"ContainerStarted","Data":"97913505b553875251247c3a3b2d498fd53efacc954458c31d03999f34047516"} Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.139374 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.140837 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cb8998785-grlzs" event={"ID":"47d6eba1-db73-4357-86a6-c15e562f20bd","Type":"ContainerStarted","Data":"d9418ed9849cb0b699fba7eb61975a6bba5868f1c73d9e7c5d8f2723edeafe92"} Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.167292 4912 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.355529187 podStartE2EDuration="44.167273099s" podCreationTimestamp="2026-03-18 13:31:37 +0000 UTC" firstStartedPulling="2026-03-18 13:31:38.680061018 +0000 UTC m=+1747.139488443" lastFinishedPulling="2026-03-18 13:32:20.49180493 +0000 UTC m=+1788.951232355" observedRunningTime="2026-03-18 13:32:21.164993418 +0000 UTC m=+1789.624420843" watchObservedRunningTime="2026-03-18 13:32:21.167273099 +0000 UTC m=+1789.626700524" Mar 18 13:32:21 crc kubenswrapper[4912]: I0318 13:32:21.206119 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-84cf4c78d4-lld2l" podStartSLOduration=2.206087252 podStartE2EDuration="2.206087252s" podCreationTimestamp="2026-03-18 13:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:21.186360363 +0000 UTC m=+1789.645787788" watchObservedRunningTime="2026-03-18 13:32:21.206087252 +0000 UTC m=+1789.665514687" Mar 18 13:32:22 crc kubenswrapper[4912]: I0318 13:32:22.346303 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-8qfhh" Mar 18 13:32:22 crc kubenswrapper[4912]: I0318 13:32:22.459690 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-9ftqr"] Mar 18 13:32:22 crc kubenswrapper[4912]: I0318 13:32:22.460633 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" podUID="f3728348-e4ca-4584-b717-c8c7a506c8c6" containerName="dnsmasq-dns" containerID="cri-o://f7f394c1de75cbbea306b8ac50934c7e3e0c860a8d50ade29ee18dc5ae82b316" gracePeriod=10 Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.222153 4912 generic.go:334] "Generic (PLEG): container finished" podID="f3728348-e4ca-4584-b717-c8c7a506c8c6" 
containerID="f7f394c1de75cbbea306b8ac50934c7e3e0c860a8d50ade29ee18dc5ae82b316" exitCode=0 Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.222260 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" event={"ID":"f3728348-e4ca-4584-b717-c8c7a506c8c6","Type":"ContainerDied","Data":"f7f394c1de75cbbea306b8ac50934c7e3e0c860a8d50ade29ee18dc5ae82b316"} Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.564808 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.737073 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-config\") pod \"f3728348-e4ca-4584-b717-c8c7a506c8c6\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.737181 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-sb\") pod \"f3728348-e4ca-4584-b717-c8c7a506c8c6\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.737357 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnnf7\" (UniqueName: \"kubernetes.io/projected/f3728348-e4ca-4584-b717-c8c7a506c8c6-kube-api-access-wnnf7\") pod \"f3728348-e4ca-4584-b717-c8c7a506c8c6\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.737415 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-swift-storage-0\") pod \"f3728348-e4ca-4584-b717-c8c7a506c8c6\" (UID: 
\"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.737466 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-nb\") pod \"f3728348-e4ca-4584-b717-c8c7a506c8c6\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.737634 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-openstack-edpm-ipam\") pod \"f3728348-e4ca-4584-b717-c8c7a506c8c6\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.737674 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-svc\") pod \"f3728348-e4ca-4584-b717-c8c7a506c8c6\" (UID: \"f3728348-e4ca-4584-b717-c8c7a506c8c6\") " Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.759497 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3728348-e4ca-4584-b717-c8c7a506c8c6-kube-api-access-wnnf7" (OuterVolumeSpecName: "kube-api-access-wnnf7") pod "f3728348-e4ca-4584-b717-c8c7a506c8c6" (UID: "f3728348-e4ca-4584-b717-c8c7a506c8c6"). InnerVolumeSpecName "kube-api-access-wnnf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.820412 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f3728348-e4ca-4584-b717-c8c7a506c8c6" (UID: "f3728348-e4ca-4584-b717-c8c7a506c8c6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.828494 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-config" (OuterVolumeSpecName: "config") pod "f3728348-e4ca-4584-b717-c8c7a506c8c6" (UID: "f3728348-e4ca-4584-b717-c8c7a506c8c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.836840 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f3728348-e4ca-4584-b717-c8c7a506c8c6" (UID: "f3728348-e4ca-4584-b717-c8c7a506c8c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.841784 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f3728348-e4ca-4584-b717-c8c7a506c8c6" (UID: "f3728348-e4ca-4584-b717-c8c7a506c8c6"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.842165 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.842211 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.842224 4912 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.842239 4912 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.842253 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnnf7\" (UniqueName: \"kubernetes.io/projected/f3728348-e4ca-4584-b717-c8c7a506c8c6-kube-api-access-wnnf7\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.843180 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f3728348-e4ca-4584-b717-c8c7a506c8c6" (UID: "f3728348-e4ca-4584-b717-c8c7a506c8c6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.855857 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f3728348-e4ca-4584-b717-c8c7a506c8c6" (UID: "f3728348-e4ca-4584-b717-c8c7a506c8c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.945355 4912 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:23 crc kubenswrapper[4912]: I0318 13:32:23.945390 4912 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3728348-e4ca-4584-b717-c8c7a506c8c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.246938 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.279455 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-9ftqr" event={"ID":"f3728348-e4ca-4584-b717-c8c7a506c8c6","Type":"ContainerDied","Data":"d407b3b45d558387c7bb74e9f234efc100d28022e85b9e8aa79a4696c600719a"} Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.279534 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.279558 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cb8998785-grlzs" event={"ID":"47d6eba1-db73-4357-86a6-c15e562f20bd","Type":"ContainerStarted","Data":"32102f4e00742de6de97080a270abfd3ec2b2fed9b57373290599f8bf6f98c99"} Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.279572 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5d44c98778-8qzdk" event={"ID":"1063826f-dbd1-4b72-b541-dd9832dd788c","Type":"ContainerStarted","Data":"620a00810888feb88ff17e90933c133cafdd20fa94e48c7a75468fa4b031fa10"} Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.279593 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.280488 4912 scope.go:117] "RemoveContainer" containerID="f7f394c1de75cbbea306b8ac50934c7e3e0c860a8d50ade29ee18dc5ae82b316" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.309976 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6cb8998785-grlzs" podStartSLOduration=2.702479358 podStartE2EDuration="5.309941956s" podCreationTimestamp="2026-03-18 13:32:19 +0000 UTC" firstStartedPulling="2026-03-18 13:32:20.47728642 +0000 UTC m=+1788.936713845" lastFinishedPulling="2026-03-18 13:32:23.084749018 +0000 UTC m=+1791.544176443" 
observedRunningTime="2026-03-18 13:32:24.28853262 +0000 UTC m=+1792.747960045" watchObservedRunningTime="2026-03-18 13:32:24.309941956 +0000 UTC m=+1792.769369381" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.333027 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5d44c98778-8qzdk" podStartSLOduration=2.872365216 podStartE2EDuration="5.332998536s" podCreationTimestamp="2026-03-18 13:32:19 +0000 UTC" firstStartedPulling="2026-03-18 13:32:20.62829376 +0000 UTC m=+1789.087721185" lastFinishedPulling="2026-03-18 13:32:23.08892707 +0000 UTC m=+1791.548354505" observedRunningTime="2026-03-18 13:32:24.321153547 +0000 UTC m=+1792.780580982" watchObservedRunningTime="2026-03-18 13:32:24.332998536 +0000 UTC m=+1792.792425961" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.352382 4912 scope.go:117] "RemoveContainer" containerID="7086ca245df10630b0e0ea50364a00edb3e8124c5b6b4be2b43e47289ef658cc" Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.378754 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-9ftqr"] Mar 18 13:32:24 crc kubenswrapper[4912]: I0318 13:32:24.447000 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-9ftqr"] Mar 18 13:32:26 crc kubenswrapper[4912]: I0318 13:32:26.247997 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3728348-e4ca-4584-b717-c8c7a506c8c6" path="/var/lib/kubelet/pods/f3728348-e4ca-4584-b717-c8c7a506c8c6/volumes" Mar 18 13:32:30 crc kubenswrapper[4912]: I0318 13:32:30.234657 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:32:30 crc kubenswrapper[4912]: E0318 13:32:30.235677 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:32:31 crc kubenswrapper[4912]: I0318 13:32:31.464495 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6cb8998785-grlzs" Mar 18 13:32:31 crc kubenswrapper[4912]: I0318 13:32:31.562310 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7b4b74494b-wgmr2"] Mar 18 13:32:31 crc kubenswrapper[4912]: I0318 13:32:31.562642 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7b4b74494b-wgmr2" podUID="47d7c152-dc8f-4406-acbf-e5af6902d651" containerName="heat-api" containerID="cri-o://51c4a7ce9f97958dc39024e55b9c6f3c087a3e537706719be3968d3a3420fe9b" gracePeriod=60 Mar 18 13:32:32 crc kubenswrapper[4912]: I0318 13:32:32.176009 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5d44c98778-8qzdk" Mar 18 13:32:32 crc kubenswrapper[4912]: I0318 13:32:32.277681 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76bbd4596f-wxfhz"] Mar 18 13:32:32 crc kubenswrapper[4912]: I0318 13:32:32.278366 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" podUID="02cb4335-ba8d-434d-b6fe-047b87453890" containerName="heat-cfnapi" containerID="cri-o://578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a" gracePeriod=60 Mar 18 13:32:34 crc kubenswrapper[4912]: I0318 13:32:34.852127 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7b4b74494b-wgmr2" podUID="47d7c152-dc8f-4406-acbf-e5af6902d651" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.234:8004/healthcheck\": read tcp 10.217.0.2:55426->10.217.0.234:8004: read: connection reset 
by peer" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.434838 4912 generic.go:334] "Generic (PLEG): container finished" podID="47d7c152-dc8f-4406-acbf-e5af6902d651" containerID="51c4a7ce9f97958dc39024e55b9c6f3c087a3e537706719be3968d3a3420fe9b" exitCode=0 Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.435122 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4b74494b-wgmr2" event={"ID":"47d7c152-dc8f-4406-acbf-e5af6902d651","Type":"ContainerDied","Data":"51c4a7ce9f97958dc39024e55b9c6f3c087a3e537706719be3968d3a3420fe9b"} Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.435663 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4b74494b-wgmr2" event={"ID":"47d7c152-dc8f-4406-acbf-e5af6902d651","Type":"ContainerDied","Data":"23e422b622c8513636bc38048544521e2c7e9b6d6b3a0c8f975c22dd57ca09cd"} Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.435692 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e422b622c8513636bc38048544521e2c7e9b6d6b3a0c8f975c22dd57ca09cd" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.471448 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.500222 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" podUID="02cb4335-ba8d-434d-b6fe-047b87453890" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.233:8000/healthcheck\": read tcp 10.217.0.2:41508->10.217.0.233:8000: read: connection reset by peer" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.522974 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-internal-tls-certs\") pod \"47d7c152-dc8f-4406-acbf-e5af6902d651\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.523032 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data\") pod \"47d7c152-dc8f-4406-acbf-e5af6902d651\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.523069 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-public-tls-certs\") pod \"47d7c152-dc8f-4406-acbf-e5af6902d651\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.523173 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data-custom\") pod \"47d7c152-dc8f-4406-acbf-e5af6902d651\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.523202 4912 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-combined-ca-bundle\") pod \"47d7c152-dc8f-4406-acbf-e5af6902d651\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.523441 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lws6r\" (UniqueName: \"kubernetes.io/projected/47d7c152-dc8f-4406-acbf-e5af6902d651-kube-api-access-lws6r\") pod \"47d7c152-dc8f-4406-acbf-e5af6902d651\" (UID: \"47d7c152-dc8f-4406-acbf-e5af6902d651\") " Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.532309 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d7c152-dc8f-4406-acbf-e5af6902d651-kube-api-access-lws6r" (OuterVolumeSpecName: "kube-api-access-lws6r") pod "47d7c152-dc8f-4406-acbf-e5af6902d651" (UID: "47d7c152-dc8f-4406-acbf-e5af6902d651"). InnerVolumeSpecName "kube-api-access-lws6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.566421 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "47d7c152-dc8f-4406-acbf-e5af6902d651" (UID: "47d7c152-dc8f-4406-acbf-e5af6902d651"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.585229 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47d7c152-dc8f-4406-acbf-e5af6902d651" (UID: "47d7c152-dc8f-4406-acbf-e5af6902d651"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.627239 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.627282 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.627296 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lws6r\" (UniqueName: \"kubernetes.io/projected/47d7c152-dc8f-4406-acbf-e5af6902d651-kube-api-access-lws6r\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.637486 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47d7c152-dc8f-4406-acbf-e5af6902d651" (UID: "47d7c152-dc8f-4406-acbf-e5af6902d651"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.638792 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data" (OuterVolumeSpecName: "config-data") pod "47d7c152-dc8f-4406-acbf-e5af6902d651" (UID: "47d7c152-dc8f-4406-acbf-e5af6902d651"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.654358 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47d7c152-dc8f-4406-acbf-e5af6902d651" (UID: "47d7c152-dc8f-4406-acbf-e5af6902d651"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.734117 4912 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.734159 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:35 crc kubenswrapper[4912]: I0318 13:32:35.734169 4912 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d7c152-dc8f-4406-acbf-e5af6902d651-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.135824 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.254095 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data\") pod \"02cb4335-ba8d-434d-b6fe-047b87453890\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.254228 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-public-tls-certs\") pod \"02cb4335-ba8d-434d-b6fe-047b87453890\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.254414 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data-custom\") pod \"02cb4335-ba8d-434d-b6fe-047b87453890\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.254482 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-internal-tls-certs\") pod \"02cb4335-ba8d-434d-b6fe-047b87453890\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.254504 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-combined-ca-bundle\") pod \"02cb4335-ba8d-434d-b6fe-047b87453890\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.254607 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2lc7t\" (UniqueName: \"kubernetes.io/projected/02cb4335-ba8d-434d-b6fe-047b87453890-kube-api-access-2lc7t\") pod \"02cb4335-ba8d-434d-b6fe-047b87453890\" (UID: \"02cb4335-ba8d-434d-b6fe-047b87453890\") " Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.266458 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "02cb4335-ba8d-434d-b6fe-047b87453890" (UID: "02cb4335-ba8d-434d-b6fe-047b87453890"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.279521 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cb4335-ba8d-434d-b6fe-047b87453890-kube-api-access-2lc7t" (OuterVolumeSpecName: "kube-api-access-2lc7t") pod "02cb4335-ba8d-434d-b6fe-047b87453890" (UID: "02cb4335-ba8d-434d-b6fe-047b87453890"). InnerVolumeSpecName "kube-api-access-2lc7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.300415 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02cb4335-ba8d-434d-b6fe-047b87453890" (UID: "02cb4335-ba8d-434d-b6fe-047b87453890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.337197 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "02cb4335-ba8d-434d-b6fe-047b87453890" (UID: "02cb4335-ba8d-434d-b6fe-047b87453890"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.357535 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data" (OuterVolumeSpecName: "config-data") pod "02cb4335-ba8d-434d-b6fe-047b87453890" (UID: "02cb4335-ba8d-434d-b6fe-047b87453890"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.358368 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lc7t\" (UniqueName: \"kubernetes.io/projected/02cb4335-ba8d-434d-b6fe-047b87453890-kube-api-access-2lc7t\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.358401 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.358412 4912 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.358421 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.358432 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.358525 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "02cb4335-ba8d-434d-b6fe-047b87453890" (UID: "02cb4335-ba8d-434d-b6fe-047b87453890"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.456411 4912 generic.go:334] "Generic (PLEG): container finished" podID="a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0" containerID="138b40e80d75cab54c3610ca9e61e4f98c098b246479fa737f8af03c998fbad5" exitCode=0 Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.456528 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0","Type":"ContainerDied","Data":"138b40e80d75cab54c3610ca9e61e4f98c098b246479fa737f8af03c998fbad5"} Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.461486 4912 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cb4335-ba8d-434d-b6fe-047b87453890-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.467374 4912 generic.go:334] "Generic (PLEG): container finished" podID="02cb4335-ba8d-434d-b6fe-047b87453890" containerID="578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a" exitCode=0 Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.467532 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.468911 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" event={"ID":"02cb4335-ba8d-434d-b6fe-047b87453890","Type":"ContainerDied","Data":"578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a"} Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.469078 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bbd4596f-wxfhz" event={"ID":"02cb4335-ba8d-434d-b6fe-047b87453890","Type":"ContainerDied","Data":"a10b010ed7d1d7a5ca046e5f19594738611f55cca9ddd18a34cdba06720b023d"} Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.469136 4912 scope.go:117] "RemoveContainer" containerID="578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.473223 4912 generic.go:334] "Generic (PLEG): container finished" podID="7b0b7b32-0583-4813-b9fd-9697bf4e9d05" containerID="830b838f3d0f8dea544bd4482f9b7c50c80722c238097cc37abe82374c790066" exitCode=0 Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.473337 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7b4b74494b-wgmr2" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.473468 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7b0b7b32-0583-4813-b9fd-9697bf4e9d05","Type":"ContainerDied","Data":"830b838f3d0f8dea544bd4482f9b7c50c80722c238097cc37abe82374c790066"} Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.571273 4912 scope.go:117] "RemoveContainer" containerID="578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a" Mar 18 13:32:36 crc kubenswrapper[4912]: E0318 13:32:36.571887 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a\": container with ID starting with 578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a not found: ID does not exist" containerID="578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.571988 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a"} err="failed to get container status \"578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a\": rpc error: code = NotFound desc = could not find container \"578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a\": container with ID starting with 578c0656f327e9b18d6024a856a03ccec91b1ecbb74001f9ef0af26c9fd2192a not found: ID does not exist" Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.632266 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7b4b74494b-wgmr2"] Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.647905 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7b4b74494b-wgmr2"] Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.664697 
4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76bbd4596f-wxfhz"] Mar 18 13:32:36 crc kubenswrapper[4912]: I0318 13:32:36.682827 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76bbd4596f-wxfhz"] Mar 18 13:32:37 crc kubenswrapper[4912]: I0318 13:32:37.490516 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0","Type":"ContainerStarted","Data":"ad7d5fe3ee498eb01c1296fd06a3ac8b90240b85d29063b2294aecb0720729a6"} Mar 18 13:32:37 crc kubenswrapper[4912]: I0318 13:32:37.491996 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 18 13:32:37 crc kubenswrapper[4912]: I0318 13:32:37.498368 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7b0b7b32-0583-4813-b9fd-9697bf4e9d05","Type":"ContainerStarted","Data":"42a86a80c104953573cb12b329eb0a413daf94244f1a30828bc2b65fa71f84e3"} Mar 18 13:32:37 crc kubenswrapper[4912]: I0318 13:32:37.499477 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:32:37 crc kubenswrapper[4912]: I0318 13:32:37.528914 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=46.528875969 podStartE2EDuration="46.528875969s" podCreationTimestamp="2026-03-18 13:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:37.526764702 +0000 UTC m=+1805.986192137" watchObservedRunningTime="2026-03-18 13:32:37.528875969 +0000 UTC m=+1805.988303394" Mar 18 13:32:37 crc kubenswrapper[4912]: I0318 13:32:37.589196 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.589164359 
podStartE2EDuration="40.589164359s" podCreationTimestamp="2026-03-18 13:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:37.574627128 +0000 UTC m=+1806.034054753" watchObservedRunningTime="2026-03-18 13:32:37.589164359 +0000 UTC m=+1806.048591784" Mar 18 13:32:38 crc kubenswrapper[4912]: I0318 13:32:38.246873 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02cb4335-ba8d-434d-b6fe-047b87453890" path="/var/lib/kubelet/pods/02cb4335-ba8d-434d-b6fe-047b87453890/volumes" Mar 18 13:32:38 crc kubenswrapper[4912]: I0318 13:32:38.248122 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d7c152-dc8f-4406-acbf-e5af6902d651" path="/var/lib/kubelet/pods/47d7c152-dc8f-4406-acbf-e5af6902d651/volumes" Mar 18 13:32:39 crc kubenswrapper[4912]: I0318 13:32:39.774997 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-84cf4c78d4-lld2l" Mar 18 13:32:39 crc kubenswrapper[4912]: I0318 13:32:39.838290 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-684f5dccdc-5k4b9"] Mar 18 13:32:39 crc kubenswrapper[4912]: I0318 13:32:39.838632 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-684f5dccdc-5k4b9" podUID="63512997-1801-4665-9f60-91d912dc57e8" containerName="heat-engine" containerID="cri-o://2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" gracePeriod=60 Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.228640 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:32:41 crc kubenswrapper[4912]: E0318 13:32:41.229498 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.725561 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v"] Mar 18 13:32:41 crc kubenswrapper[4912]: E0318 13:32:41.726199 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3728348-e4ca-4584-b717-c8c7a506c8c6" containerName="init" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.726217 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3728348-e4ca-4584-b717-c8c7a506c8c6" containerName="init" Mar 18 13:32:41 crc kubenswrapper[4912]: E0318 13:32:41.726261 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3728348-e4ca-4584-b717-c8c7a506c8c6" containerName="dnsmasq-dns" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.726267 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3728348-e4ca-4584-b717-c8c7a506c8c6" containerName="dnsmasq-dns" Mar 18 13:32:41 crc kubenswrapper[4912]: E0318 13:32:41.726291 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d7c152-dc8f-4406-acbf-e5af6902d651" containerName="heat-api" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.726298 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d7c152-dc8f-4406-acbf-e5af6902d651" containerName="heat-api" Mar 18 13:32:41 crc kubenswrapper[4912]: E0318 13:32:41.726311 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cb4335-ba8d-434d-b6fe-047b87453890" containerName="heat-cfnapi" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.726317 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cb4335-ba8d-434d-b6fe-047b87453890" containerName="heat-cfnapi" Mar 18 13:32:41 crc 
kubenswrapper[4912]: I0318 13:32:41.726559 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cb4335-ba8d-434d-b6fe-047b87453890" containerName="heat-cfnapi" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.726571 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d7c152-dc8f-4406-acbf-e5af6902d651" containerName="heat-api" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.726585 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3728348-e4ca-4584-b717-c8c7a506c8c6" containerName="dnsmasq-dns" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.727616 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.735592 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.735832 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.735981 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.736131 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.749234 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v"] Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.824248 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sr4q\" (UniqueName: \"kubernetes.io/projected/edb56b41-20f0-40db-925f-fb26ec712461-kube-api-access-8sr4q\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.824468 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.824535 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.824830 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.928168 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.928306 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.928456 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sr4q\" (UniqueName: \"kubernetes.io/projected/edb56b41-20f0-40db-925f-fb26ec712461-kube-api-access-8sr4q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.928625 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.936386 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.941964 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.957451 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sr4q\" (UniqueName: \"kubernetes.io/projected/edb56b41-20f0-40db-925f-fb26ec712461-kube-api-access-8sr4q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:41 crc kubenswrapper[4912]: I0318 13:32:41.957907 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:42 crc kubenswrapper[4912]: I0318 13:32:42.056371 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:32:42 crc kubenswrapper[4912]: I0318 13:32:42.766219 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-nk4hg"] Mar 18 13:32:42 crc kubenswrapper[4912]: I0318 13:32:42.793494 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-nk4hg"] Mar 18 13:32:42 crc kubenswrapper[4912]: I0318 13:32:42.948490 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-qgkzn"] Mar 18 13:32:42 crc kubenswrapper[4912]: I0318 13:32:42.966873 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qgkzn"] Mar 18 13:32:42 crc kubenswrapper[4912]: I0318 13:32:42.967086 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:42 crc kubenswrapper[4912]: I0318 13:32:42.971715 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.071158 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkr2g\" (UniqueName: \"kubernetes.io/projected/880a8bf3-227a-44a5-89ef-ee032d977775-kube-api-access-lkr2g\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.071300 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-scripts\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.071413 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-config-data\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.071450 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-combined-ca-bundle\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.174272 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkr2g\" (UniqueName: \"kubernetes.io/projected/880a8bf3-227a-44a5-89ef-ee032d977775-kube-api-access-lkr2g\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.174342 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-scripts\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.174395 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-config-data\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.174447 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-combined-ca-bundle\") pod \"aodh-db-sync-qgkzn\" (UID: 
\"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.183780 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-config-data\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.186764 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-scripts\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.188946 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-combined-ca-bundle\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.195637 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkr2g\" (UniqueName: \"kubernetes.io/projected/880a8bf3-227a-44a5-89ef-ee032d977775-kube-api-access-lkr2g\") pod \"aodh-db-sync-qgkzn\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.300187 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.889330 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qgkzn"] Mar 18 13:32:43 crc kubenswrapper[4912]: I0318 13:32:43.906842 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v"] Mar 18 13:32:43 crc kubenswrapper[4912]: W0318 13:32:43.923704 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod880a8bf3_227a_44a5_89ef_ee032d977775.slice/crio-6fa80c44a738d81bd1f3d022920bc6bdef9eb4d118976eb170d481ef32fc6cc7 WatchSource:0}: Error finding container 6fa80c44a738d81bd1f3d022920bc6bdef9eb4d118976eb170d481ef32fc6cc7: Status 404 returned error can't find the container with id 6fa80c44a738d81bd1f3d022920bc6bdef9eb4d118976eb170d481ef32fc6cc7 Mar 18 13:32:44 crc kubenswrapper[4912]: I0318 13:32:44.303482 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a703388-f5f1-4975-9c2c-5ac152798930" path="/var/lib/kubelet/pods/2a703388-f5f1-4975-9c2c-5ac152798930/volumes" Mar 18 13:32:44 crc kubenswrapper[4912]: I0318 13:32:44.623835 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" event={"ID":"edb56b41-20f0-40db-925f-fb26ec712461","Type":"ContainerStarted","Data":"46e2ab64bc568eed4bb3b561b368ea3d2596c92cfd11ce6b57b1aa06cb557f2d"} Mar 18 13:32:44 crc kubenswrapper[4912]: I0318 13:32:44.626463 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgkzn" event={"ID":"880a8bf3-227a-44a5-89ef-ee032d977775","Type":"ContainerStarted","Data":"6fa80c44a738d81bd1f3d022920bc6bdef9eb4d118976eb170d481ef32fc6cc7"} Mar 18 13:32:45 crc kubenswrapper[4912]: E0318 13:32:45.039253 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:32:45 crc kubenswrapper[4912]: E0318 13:32:45.080634 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:32:45 crc kubenswrapper[4912]: E0318 13:32:45.089618 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:32:45 crc kubenswrapper[4912]: E0318 13:32:45.089737 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-684f5dccdc-5k4b9" podUID="63512997-1801-4665-9f60-91d912dc57e8" containerName="heat-engine" Mar 18 13:32:48 crc kubenswrapper[4912]: I0318 13:32:48.558526 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="7b0b7b32-0583-4813-b9fd-9697bf4e9d05" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.24:5671: connect: connection refused" Mar 18 13:32:50 crc kubenswrapper[4912]: I0318 13:32:50.552774 4912 scope.go:117] "RemoveContainer" containerID="1d2a166924f5b4f3e488f2ea7f00fb05c68b56c294cea47dc73fb81918b8845b" Mar 18 13:32:52 crc kubenswrapper[4912]: I0318 13:32:52.245082 4912 scope.go:117] "RemoveContainer" 
containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:32:52 crc kubenswrapper[4912]: E0318 13:32:52.246413 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:32:52 crc kubenswrapper[4912]: I0318 13:32:52.275094 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.23:5671: connect: connection refused" Mar 18 13:32:52 crc kubenswrapper[4912]: I0318 13:32:52.774333 4912 generic.go:334] "Generic (PLEG): container finished" podID="63512997-1801-4665-9f60-91d912dc57e8" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" exitCode=0 Mar 18 13:32:52 crc kubenswrapper[4912]: I0318 13:32:52.774426 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-684f5dccdc-5k4b9" event={"ID":"63512997-1801-4665-9f60-91d912dc57e8","Type":"ContainerDied","Data":"2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21"} Mar 18 13:32:55 crc kubenswrapper[4912]: E0318 13:32:55.025168 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21 is running failed: container process not found" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:32:55 crc kubenswrapper[4912]: E0318 13:32:55.025943 4912 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21 is running failed: container process not found" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:32:55 crc kubenswrapper[4912]: E0318 13:32:55.026305 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21 is running failed: container process not found" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 13:32:55 crc kubenswrapper[4912]: E0318 13:32:55.026375 4912 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-684f5dccdc-5k4b9" podUID="63512997-1801-4665-9f60-91d912dc57e8" containerName="heat-engine" Mar 18 13:32:57 crc kubenswrapper[4912]: I0318 13:32:57.651272 4912 scope.go:117] "RemoveContainer" containerID="721e891730d0796f3d304ba629af0751a53659d4ba1dab881f94538eb2e1c84e" Mar 18 13:32:57 crc kubenswrapper[4912]: I0318 13:32:57.986708 4912 scope.go:117] "RemoveContainer" containerID="f5801bb0293739b2976ca898be44b5d6224ac06ff96f49158776c37517b87d88" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.049985 4912 scope.go:117] "RemoveContainer" containerID="42e0127644f532b348025f7df7619a404bc946dab2301b65068c037fe8bc9f8d" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.086557 4912 scope.go:117] "RemoveContainer" containerID="454531e67e48c62d206f8691ac34b2f156047af5873d1a3c1c66859c808ee5e9" Mar 18 
13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.185600 4912 scope.go:117] "RemoveContainer" containerID="b15d4fb8c42327da3fbd061f47dd161f5f210106735fd6e8f75cc8b69d650bee" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.189393 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.234130 4912 scope.go:117] "RemoveContainer" containerID="8cb27bbe63cf586b29efbafbd56bc38226c8ccf81ec48f1442ed99ae56a48117" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.302155 4912 scope.go:117] "RemoveContainer" containerID="fea833f54f21a2c58861ffc97de67f6c570ec8fd3429ab4275e58e62000dc932" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.304756 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qssx8\" (UniqueName: \"kubernetes.io/projected/63512997-1801-4665-9f60-91d912dc57e8-kube-api-access-qssx8\") pod \"63512997-1801-4665-9f60-91d912dc57e8\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.304924 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data\") pod \"63512997-1801-4665-9f60-91d912dc57e8\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.305115 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data-custom\") pod \"63512997-1801-4665-9f60-91d912dc57e8\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.305161 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-combined-ca-bundle\") pod \"63512997-1801-4665-9f60-91d912dc57e8\" (UID: \"63512997-1801-4665-9f60-91d912dc57e8\") " Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.311837 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63512997-1801-4665-9f60-91d912dc57e8" (UID: "63512997-1801-4665-9f60-91d912dc57e8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.312311 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63512997-1801-4665-9f60-91d912dc57e8-kube-api-access-qssx8" (OuterVolumeSpecName: "kube-api-access-qssx8") pod "63512997-1801-4665-9f60-91d912dc57e8" (UID: "63512997-1801-4665-9f60-91d912dc57e8"). InnerVolumeSpecName "kube-api-access-qssx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.332478 4912 scope.go:117] "RemoveContainer" containerID="00eca2aec65f34cd0d48bc6e825d46de32abceac203177a231daac5d742dfb97" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.340270 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63512997-1801-4665-9f60-91d912dc57e8" (UID: "63512997-1801-4665-9f60-91d912dc57e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.379388 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data" (OuterVolumeSpecName: "config-data") pod "63512997-1801-4665-9f60-91d912dc57e8" (UID: "63512997-1801-4665-9f60-91d912dc57e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.409075 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qssx8\" (UniqueName: \"kubernetes.io/projected/63512997-1801-4665-9f60-91d912dc57e8-kube-api-access-qssx8\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.409156 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.409176 4912 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.409188 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63512997-1801-4665-9f60-91d912dc57e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.557447 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.895487 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" 
event={"ID":"edb56b41-20f0-40db-925f-fb26ec712461","Type":"ContainerStarted","Data":"ba525c0763e2325c025a37caa5ecc1f2fcfb715180ce82cf78ef857fe6e3568d"} Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.898264 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgkzn" event={"ID":"880a8bf3-227a-44a5-89ef-ee032d977775","Type":"ContainerStarted","Data":"f2978ffac8f239da1dfd31725e618d0fe53b53fa43c2c2886a7f1ffd2540d727"} Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.900943 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-684f5dccdc-5k4b9" event={"ID":"63512997-1801-4665-9f60-91d912dc57e8","Type":"ContainerDied","Data":"6d85c67e67fe1e8c98a908e4516ac993d252357f55fb8cb607597eab37d148ef"} Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.900979 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-684f5dccdc-5k4b9" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.901013 4912 scope.go:117] "RemoveContainer" containerID="2bb1323d51cc0a3bcb4236a9531980cb8725a850f055c29b3cc91ee7b9344c21" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.925526 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" podStartSLOduration=4.182046684 podStartE2EDuration="17.925498078s" podCreationTimestamp="2026-03-18 13:32:41 +0000 UTC" firstStartedPulling="2026-03-18 13:32:43.914450346 +0000 UTC m=+1812.373877771" lastFinishedPulling="2026-03-18 13:32:57.65790174 +0000 UTC m=+1826.117329165" observedRunningTime="2026-03-18 13:32:58.915256623 +0000 UTC m=+1827.374684078" watchObservedRunningTime="2026-03-18 13:32:58.925498078 +0000 UTC m=+1827.384925503" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.948955 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-qgkzn" podStartSLOduration=3.225206393 
podStartE2EDuration="16.948915987s" podCreationTimestamp="2026-03-18 13:32:42 +0000 UTC" firstStartedPulling="2026-03-18 13:32:43.929612303 +0000 UTC m=+1812.389039728" lastFinishedPulling="2026-03-18 13:32:57.653321897 +0000 UTC m=+1826.112749322" observedRunningTime="2026-03-18 13:32:58.94305904 +0000 UTC m=+1827.402486465" watchObservedRunningTime="2026-03-18 13:32:58.948915987 +0000 UTC m=+1827.408343412" Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.980338 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-684f5dccdc-5k4b9"] Mar 18 13:32:58 crc kubenswrapper[4912]: I0318 13:32:58.997015 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-684f5dccdc-5k4b9"] Mar 18 13:33:00 crc kubenswrapper[4912]: I0318 13:33:00.249879 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63512997-1801-4665-9f60-91d912dc57e8" path="/var/lib/kubelet/pods/63512997-1801-4665-9f60-91d912dc57e8/volumes" Mar 18 13:33:01 crc kubenswrapper[4912]: I0318 13:33:01.947197 4912 generic.go:334] "Generic (PLEG): container finished" podID="880a8bf3-227a-44a5-89ef-ee032d977775" containerID="f2978ffac8f239da1dfd31725e618d0fe53b53fa43c2c2886a7f1ffd2540d727" exitCode=0 Mar 18 13:33:01 crc kubenswrapper[4912]: I0318 13:33:01.947375 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgkzn" event={"ID":"880a8bf3-227a-44a5-89ef-ee032d977775","Type":"ContainerDied","Data":"f2978ffac8f239da1dfd31725e618d0fe53b53fa43c2c2886a7f1ffd2540d727"} Mar 18 13:33:02 crc kubenswrapper[4912]: I0318 13:33:02.273352 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 18 13:33:02 crc kubenswrapper[4912]: I0318 13:33:02.353982 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.414906 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.486357 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-config-data\") pod \"880a8bf3-227a-44a5-89ef-ee032d977775\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.486431 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkr2g\" (UniqueName: \"kubernetes.io/projected/880a8bf3-227a-44a5-89ef-ee032d977775-kube-api-access-lkr2g\") pod \"880a8bf3-227a-44a5-89ef-ee032d977775\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.486610 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-combined-ca-bundle\") pod \"880a8bf3-227a-44a5-89ef-ee032d977775\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.486707 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-scripts\") pod \"880a8bf3-227a-44a5-89ef-ee032d977775\" (UID: \"880a8bf3-227a-44a5-89ef-ee032d977775\") " Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.500916 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880a8bf3-227a-44a5-89ef-ee032d977775-kube-api-access-lkr2g" (OuterVolumeSpecName: "kube-api-access-lkr2g") pod "880a8bf3-227a-44a5-89ef-ee032d977775" (UID: "880a8bf3-227a-44a5-89ef-ee032d977775"). InnerVolumeSpecName "kube-api-access-lkr2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.503001 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-scripts" (OuterVolumeSpecName: "scripts") pod "880a8bf3-227a-44a5-89ef-ee032d977775" (UID: "880a8bf3-227a-44a5-89ef-ee032d977775"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.544635 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-config-data" (OuterVolumeSpecName: "config-data") pod "880a8bf3-227a-44a5-89ef-ee032d977775" (UID: "880a8bf3-227a-44a5-89ef-ee032d977775"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.573390 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "880a8bf3-227a-44a5-89ef-ee032d977775" (UID: "880a8bf3-227a-44a5-89ef-ee032d977775"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.594123 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.594180 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.594199 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkr2g\" (UniqueName: \"kubernetes.io/projected/880a8bf3-227a-44a5-89ef-ee032d977775-kube-api-access-lkr2g\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.594216 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a8bf3-227a-44a5-89ef-ee032d977775-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.980145 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgkzn" event={"ID":"880a8bf3-227a-44a5-89ef-ee032d977775","Type":"ContainerDied","Data":"6fa80c44a738d81bd1f3d022920bc6bdef9eb4d118976eb170d481ef32fc6cc7"} Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.980565 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa80c44a738d81bd1f3d022920bc6bdef9eb4d118976eb170d481ef32fc6cc7" Mar 18 13:33:03 crc kubenswrapper[4912]: I0318 13:33:03.980240 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qgkzn" Mar 18 13:33:06 crc kubenswrapper[4912]: I0318 13:33:06.229051 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:33:06 crc kubenswrapper[4912]: E0318 13:33:06.229816 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:33:06 crc kubenswrapper[4912]: I0318 13:33:06.740216 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerName="rabbitmq" containerID="cri-o://8d7e357063efa4771949307841a428e486507bdf5be05cfb41589e927c52092d" gracePeriod=604796 Mar 18 13:33:07 crc kubenswrapper[4912]: I0318 13:33:07.991103 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 13:33:07 crc kubenswrapper[4912]: I0318 13:33:07.991943 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-api" containerID="cri-o://779ca3a11a650e6517547026d581476363ff0160365bb2d2c42a48d30d164d8c" gracePeriod=30 Mar 18 13:33:07 crc kubenswrapper[4912]: I0318 13:33:07.992076 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-listener" containerID="cri-o://0b392ef47e5ad9bb3905e165cf3e5542e2b74e1b2d39bd3d7175b57d598f80c4" gracePeriod=30 Mar 18 13:33:07 crc kubenswrapper[4912]: I0318 13:33:07.992962 4912 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/aodh-0" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-notifier" containerID="cri-o://d92c978d0ab1bdaabec7ab01580be2f34b40cfbc7b1e94e08159cd9fbc124643" gracePeriod=30 Mar 18 13:33:07 crc kubenswrapper[4912]: I0318 13:33:07.993059 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-evaluator" containerID="cri-o://0c731e46d6f4d6e2d17f95dc8b4ad1074f341a6f67781194e648ef8922c76edc" gracePeriod=30 Mar 18 13:33:09 crc kubenswrapper[4912]: I0318 13:33:09.072836 4912 generic.go:334] "Generic (PLEG): container finished" podID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerID="0c731e46d6f4d6e2d17f95dc8b4ad1074f341a6f67781194e648ef8922c76edc" exitCode=0 Mar 18 13:33:09 crc kubenswrapper[4912]: I0318 13:33:09.072877 4912 generic.go:334] "Generic (PLEG): container finished" podID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerID="779ca3a11a650e6517547026d581476363ff0160365bb2d2c42a48d30d164d8c" exitCode=0 Mar 18 13:33:09 crc kubenswrapper[4912]: I0318 13:33:09.072931 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerDied","Data":"0c731e46d6f4d6e2d17f95dc8b4ad1074f341a6f67781194e648ef8922c76edc"} Mar 18 13:33:09 crc kubenswrapper[4912]: I0318 13:33:09.072991 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerDied","Data":"779ca3a11a650e6517547026d581476363ff0160365bb2d2c42a48d30d164d8c"} Mar 18 13:33:09 crc kubenswrapper[4912]: I0318 13:33:09.074534 4912 generic.go:334] "Generic (PLEG): container finished" podID="edb56b41-20f0-40db-925f-fb26ec712461" containerID="ba525c0763e2325c025a37caa5ecc1f2fcfb715180ce82cf78ef857fe6e3568d" exitCode=0 Mar 18 13:33:09 crc kubenswrapper[4912]: I0318 13:33:09.074557 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" event={"ID":"edb56b41-20f0-40db-925f-fb26ec712461","Type":"ContainerDied","Data":"ba525c0763e2325c025a37caa5ecc1f2fcfb715180ce82cf78ef857fe6e3568d"} Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.671391 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.731595 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-inventory\") pod \"edb56b41-20f0-40db-925f-fb26ec712461\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.731846 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sr4q\" (UniqueName: \"kubernetes.io/projected/edb56b41-20f0-40db-925f-fb26ec712461-kube-api-access-8sr4q\") pod \"edb56b41-20f0-40db-925f-fb26ec712461\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.731888 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-repo-setup-combined-ca-bundle\") pod \"edb56b41-20f0-40db-925f-fb26ec712461\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.732059 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-ssh-key-openstack-edpm-ipam\") pod \"edb56b41-20f0-40db-925f-fb26ec712461\" (UID: \"edb56b41-20f0-40db-925f-fb26ec712461\") " Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 
13:33:10.748820 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "edb56b41-20f0-40db-925f-fb26ec712461" (UID: "edb56b41-20f0-40db-925f-fb26ec712461"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.748923 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb56b41-20f0-40db-925f-fb26ec712461-kube-api-access-8sr4q" (OuterVolumeSpecName: "kube-api-access-8sr4q") pod "edb56b41-20f0-40db-925f-fb26ec712461" (UID: "edb56b41-20f0-40db-925f-fb26ec712461"). InnerVolumeSpecName "kube-api-access-8sr4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.778179 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-inventory" (OuterVolumeSpecName: "inventory") pod "edb56b41-20f0-40db-925f-fb26ec712461" (UID: "edb56b41-20f0-40db-925f-fb26ec712461"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.778284 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "edb56b41-20f0-40db-925f-fb26ec712461" (UID: "edb56b41-20f0-40db-925f-fb26ec712461"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.843291 4912 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.843337 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.843349 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edb56b41-20f0-40db-925f-fb26ec712461-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:10 crc kubenswrapper[4912]: I0318 13:33:10.843361 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sr4q\" (UniqueName: \"kubernetes.io/projected/edb56b41-20f0-40db-925f-fb26ec712461-kube-api-access-8sr4q\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.103472 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" event={"ID":"edb56b41-20f0-40db-925f-fb26ec712461","Type":"ContainerDied","Data":"46e2ab64bc568eed4bb3b561b368ea3d2596c92cfd11ce6b57b1aa06cb557f2d"} Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.103931 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e2ab64bc568eed4bb3b561b368ea3d2596c92cfd11ce6b57b1aa06cb557f2d" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.103537 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.206029 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp"] Mar 18 13:33:11 crc kubenswrapper[4912]: E0318 13:33:11.206766 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880a8bf3-227a-44a5-89ef-ee032d977775" containerName="aodh-db-sync" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.206789 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="880a8bf3-227a-44a5-89ef-ee032d977775" containerName="aodh-db-sync" Mar 18 13:33:11 crc kubenswrapper[4912]: E0318 13:33:11.206808 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb56b41-20f0-40db-925f-fb26ec712461" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.206817 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb56b41-20f0-40db-925f-fb26ec712461" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 13:33:11 crc kubenswrapper[4912]: E0318 13:33:11.206849 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63512997-1801-4665-9f60-91d912dc57e8" containerName="heat-engine" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.206856 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="63512997-1801-4665-9f60-91d912dc57e8" containerName="heat-engine" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.207230 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="880a8bf3-227a-44a5-89ef-ee032d977775" containerName="aodh-db-sync" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.207260 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb56b41-20f0-40db-925f-fb26ec712461" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 
13:33:11.207275 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="63512997-1801-4665-9f60-91d912dc57e8" containerName="heat-engine" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.208384 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.211668 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.211685 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.211736 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.212140 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.231572 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp"] Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.254457 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj67w\" (UniqueName: \"kubernetes.io/projected/0e7cc04c-de03-4e24-b041-663be152ac0e-kube-api-access-xj67w\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.254612 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.254672 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.357930 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj67w\" (UniqueName: \"kubernetes.io/projected/0e7cc04c-de03-4e24-b041-663be152ac0e-kube-api-access-xj67w\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.358092 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.358139 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.363616 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.365401 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.378990 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj67w\" (UniqueName: \"kubernetes.io/projected/0e7cc04c-de03-4e24-b041-663be152ac0e-kube-api-access-xj67w\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qwgqp\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:11 crc kubenswrapper[4912]: I0318 13:33:11.541455 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:12 crc kubenswrapper[4912]: I0318 13:33:12.111212 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp"] Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.149335 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" event={"ID":"0e7cc04c-de03-4e24-b041-663be152ac0e","Type":"ContainerStarted","Data":"b17cb46ce7a369a5bf2beefbf837f492ca45fe13d41f9218e2f061968862247d"} Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.156871 4912 generic.go:334] "Generic (PLEG): container finished" podID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerID="d92c978d0ab1bdaabec7ab01580be2f34b40cfbc7b1e94e08159cd9fbc124643" exitCode=0 Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.156963 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerDied","Data":"d92c978d0ab1bdaabec7ab01580be2f34b40cfbc7b1e94e08159cd9fbc124643"} Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.163233 4912 generic.go:334] "Generic (PLEG): container finished" podID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerID="8d7e357063efa4771949307841a428e486507bdf5be05cfb41589e927c52092d" exitCode=0 Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.163286 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97b795fb-bb07-4401-8e18-0b826303b4ba","Type":"ContainerDied","Data":"8d7e357063efa4771949307841a428e486507bdf5be05cfb41589e927c52092d"} Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.447202 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.535483 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-server-conf\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.535587 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-tls\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.535642 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-erlang-cookie\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.535757 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-plugins\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.535791 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-confd\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.535845 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-76x4z\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-kube-api-access-76x4z\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.535956 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-config-data\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.536095 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97b795fb-bb07-4401-8e18-0b826303b4ba-erlang-cookie-secret\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.537280 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.540241 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.544526 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-kube-api-access-76x4z" (OuterVolumeSpecName: "kube-api-access-76x4z") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "kube-api-access-76x4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.544684 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.545192 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.545322 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-plugins-conf\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: \"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.545367 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97b795fb-bb07-4401-8e18-0b826303b4ba-pod-info\") pod \"97b795fb-bb07-4401-8e18-0b826303b4ba\" (UID: 
\"97b795fb-bb07-4401-8e18-0b826303b4ba\") " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.545201 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b795fb-bb07-4401-8e18-0b826303b4ba-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.546590 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.547137 4912 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.547217 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.547316 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.547405 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-plugins\") on node 
\"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.547491 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76x4z\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-kube-api-access-76x4z\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.547558 4912 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97b795fb-bb07-4401-8e18-0b826303b4ba-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.551328 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/97b795fb-bb07-4401-8e18-0b826303b4ba-pod-info" (OuterVolumeSpecName: "pod-info") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.591528 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929" (OuterVolumeSpecName: "persistence") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.601194 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-config-data" (OuterVolumeSpecName: "config-data") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.620308 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-server-conf" (OuterVolumeSpecName: "server-conf") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.653802 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.656103 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") on node \"crc\" " Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.656134 4912 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97b795fb-bb07-4401-8e18-0b826303b4ba-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.656147 4912 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97b795fb-bb07-4401-8e18-0b826303b4ba-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.705755 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.705977 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929") on node "crc" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.722509 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "97b795fb-bb07-4401-8e18-0b826303b4ba" (UID: "97b795fb-bb07-4401-8e18-0b826303b4ba"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.763063 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97b795fb-bb07-4401-8e18-0b826303b4ba-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:13 crc kubenswrapper[4912]: I0318 13:33:13.763103 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.187180 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" event={"ID":"0e7cc04c-de03-4e24-b041-663be152ac0e","Type":"ContainerStarted","Data":"3738d47639c241565ebe0520901f45845011be35fe3c7a12060ccf3554c825c2"} Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.190737 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97b795fb-bb07-4401-8e18-0b826303b4ba","Type":"ContainerDied","Data":"0e85baf20994fcef0c477cb3dd9bdfc3baa9ab328d7579b9e6e14f3562cb4d7d"} Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 
13:33:14.190805 4912 scope.go:117] "RemoveContainer" containerID="8d7e357063efa4771949307841a428e486507bdf5be05cfb41589e927c52092d" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.191065 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.215665 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" podStartSLOduration=2.492914623 podStartE2EDuration="3.215629602s" podCreationTimestamp="2026-03-18 13:33:11 +0000 UTC" firstStartedPulling="2026-03-18 13:33:12.114009533 +0000 UTC m=+1840.573436958" lastFinishedPulling="2026-03-18 13:33:12.836724512 +0000 UTC m=+1841.296151937" observedRunningTime="2026-03-18 13:33:14.206588859 +0000 UTC m=+1842.666016284" watchObservedRunningTime="2026-03-18 13:33:14.215629602 +0000 UTC m=+1842.675057027" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.220166 4912 scope.go:117] "RemoveContainer" containerID="44d2c20bf933e8a8a31df1480a2142c6fd3cd3bffe7c9d96198e1527e8ce82b2" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.260546 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.271827 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.320872 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:33:14 crc kubenswrapper[4912]: E0318 13:33:14.321932 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerName="rabbitmq" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.321965 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerName="rabbitmq" Mar 18 13:33:14 crc 
kubenswrapper[4912]: E0318 13:33:14.322223 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerName="setup-container" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.322247 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerName="setup-container" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.322589 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" containerName="rabbitmq" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.325626 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.345554 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.394330 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.395624 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db9357ac-df66-4e60-bddf-38d4f8847623-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.395754 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-server-conf\") pod \"rabbitmq-server-1\" (UID: 
\"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.395822 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-config-data\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.395900 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.395946 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.396078 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.396211 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db9357ac-df66-4e60-bddf-38d4f8847623-pod-info\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " 
pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.396230 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7jt\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-kube-api-access-sh7jt\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.396450 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.396607 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.499726 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db9357ac-df66-4e60-bddf-38d4f8847623-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.500152 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-server-conf\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 
crc kubenswrapper[4912]: I0318 13:33:14.500281 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-config-data\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.500437 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.500565 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.500744 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.501101 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.501369 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.501493 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db9357ac-df66-4e60-bddf-38d4f8847623-pod-info\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.501583 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7jt\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-kube-api-access-sh7jt\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.501788 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.501621 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-server-conf\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.501502 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-config-data\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " 
pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.502010 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.502462 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.503770 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db9357ac-df66-4e60-bddf-38d4f8847623-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.507474 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db9357ac-df66-4e60-bddf-38d4f8847623-pod-info\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.509096 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.514738 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.516241 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.516295 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8eb5ba5613cad0584a090ab94d3d93f59f943bdbf827f567e324c9dfa87263aa/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.527572 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db9357ac-df66-4e60-bddf-38d4f8847623-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.540695 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7jt\" (UniqueName: \"kubernetes.io/projected/db9357ac-df66-4e60-bddf-38d4f8847623-kube-api-access-sh7jt\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.620770 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b0eada7-29a9-43f4-b848-0f3a2cb9b929\") pod \"rabbitmq-server-1\" (UID: \"db9357ac-df66-4e60-bddf-38d4f8847623\") " pod="openstack/rabbitmq-server-1" Mar 18 13:33:14 crc kubenswrapper[4912]: I0318 13:33:14.653939 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 13:33:15 crc kubenswrapper[4912]: I0318 13:33:15.283629 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 13:33:16 crc kubenswrapper[4912]: I0318 13:33:16.219579 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"db9357ac-df66-4e60-bddf-38d4f8847623","Type":"ContainerStarted","Data":"191b0457f10d8245b30790c02f1a40e4d51625b8271d20b2d667e2c28b770ef8"} Mar 18 13:33:16 crc kubenswrapper[4912]: I0318 13:33:16.244230 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b795fb-bb07-4401-8e18-0b826303b4ba" path="/var/lib/kubelet/pods/97b795fb-bb07-4401-8e18-0b826303b4ba/volumes" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.251531 4912 generic.go:334] "Generic (PLEG): container finished" podID="0e7cc04c-de03-4e24-b041-663be152ac0e" containerID="3738d47639c241565ebe0520901f45845011be35fe3c7a12060ccf3554c825c2" exitCode=0 Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.251611 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" event={"ID":"0e7cc04c-de03-4e24-b041-663be152ac0e","Type":"ContainerDied","Data":"3738d47639c241565ebe0520901f45845011be35fe3c7a12060ccf3554c825c2"} Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.256412 4912 generic.go:334] "Generic (PLEG): container finished" podID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerID="0b392ef47e5ad9bb3905e165cf3e5542e2b74e1b2d39bd3d7175b57d598f80c4" exitCode=0 Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.256480 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerDied","Data":"0b392ef47e5ad9bb3905e165cf3e5542e2b74e1b2d39bd3d7175b57d598f80c4"} Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.633905 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.724369 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-public-tls-certs\") pod \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.724516 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-internal-tls-certs\") pod \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.724779 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-scripts\") pod \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.724842 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-combined-ca-bundle\") pod \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.724885 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8x7z\" (UniqueName: 
\"kubernetes.io/projected/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-kube-api-access-j8x7z\") pod \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.725119 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-config-data\") pod \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\" (UID: \"e37dd0f9-c4f8-4859-a45a-b821c2584b8a\") " Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.740017 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-scripts" (OuterVolumeSpecName: "scripts") pod "e37dd0f9-c4f8-4859-a45a-b821c2584b8a" (UID: "e37dd0f9-c4f8-4859-a45a-b821c2584b8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.748896 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-kube-api-access-j8x7z" (OuterVolumeSpecName: "kube-api-access-j8x7z") pod "e37dd0f9-c4f8-4859-a45a-b821c2584b8a" (UID: "e37dd0f9-c4f8-4859-a45a-b821c2584b8a"). InnerVolumeSpecName "kube-api-access-j8x7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.829167 4912 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.829218 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8x7z\" (UniqueName: \"kubernetes.io/projected/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-kube-api-access-j8x7z\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.832193 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e37dd0f9-c4f8-4859-a45a-b821c2584b8a" (UID: "e37dd0f9-c4f8-4859-a45a-b821c2584b8a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.860533 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e37dd0f9-c4f8-4859-a45a-b821c2584b8a" (UID: "e37dd0f9-c4f8-4859-a45a-b821c2584b8a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.932806 4912 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.932857 4912 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.955597 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e37dd0f9-c4f8-4859-a45a-b821c2584b8a" (UID: "e37dd0f9-c4f8-4859-a45a-b821c2584b8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:17 crc kubenswrapper[4912]: I0318 13:33:17.970066 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-config-data" (OuterVolumeSpecName: "config-data") pod "e37dd0f9-c4f8-4859-a45a-b821c2584b8a" (UID: "e37dd0f9-c4f8-4859-a45a-b821c2584b8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.035969 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.036005 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37dd0f9-c4f8-4859-a45a-b821c2584b8a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.228616 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:33:18 crc kubenswrapper[4912]: E0318 13:33:18.229180 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.273483 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e37dd0f9-c4f8-4859-a45a-b821c2584b8a","Type":"ContainerDied","Data":"62ed403ab2e2ddf594cdd9401c274876b3cc1e35e0ea0ae9bae61772e3d02b06"} Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.273567 4912 scope.go:117] "RemoveContainer" containerID="0b392ef47e5ad9bb3905e165cf3e5542e2b74e1b2d39bd3d7175b57d598f80c4" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.273583 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.277338 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"db9357ac-df66-4e60-bddf-38d4f8847623","Type":"ContainerStarted","Data":"d42dae28db0a37386bc6460f141dcc53b4f42dc060bd57fa0c6a7600b9fbe330"} Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.315968 4912 scope.go:117] "RemoveContainer" containerID="d92c978d0ab1bdaabec7ab01580be2f34b40cfbc7b1e94e08159cd9fbc124643" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.361741 4912 scope.go:117] "RemoveContainer" containerID="0c731e46d6f4d6e2d17f95dc8b4ad1074f341a6f67781194e648ef8922c76edc" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.389441 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.427396 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.443276 4912 scope.go:117] "RemoveContainer" containerID="779ca3a11a650e6517547026d581476363ff0160365bb2d2c42a48d30d164d8c" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.445945 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 13:33:18 crc kubenswrapper[4912]: E0318 13:33:18.446692 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-listener" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.446714 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-listener" Mar 18 13:33:18 crc kubenswrapper[4912]: E0318 13:33:18.446747 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-notifier" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.446756 4912 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-notifier" Mar 18 13:33:18 crc kubenswrapper[4912]: E0318 13:33:18.446785 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-api" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.446793 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-api" Mar 18 13:33:18 crc kubenswrapper[4912]: E0318 13:33:18.446806 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-evaluator" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.446814 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-evaluator" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.447317 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-evaluator" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.447362 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-api" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.447385 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-notifier" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.447398 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" containerName="aodh-listener" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.450508 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.454615 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.456490 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.456711 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d7sqs" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.456897 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.457389 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.473675 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.554654 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8nrf\" (UniqueName: \"kubernetes.io/projected/122ef44f-951b-4aa0-bef1-d190f7b5a495-kube-api-access-n8nrf\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.554720 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-combined-ca-bundle\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.554881 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-internal-tls-certs\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.555015 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-scripts\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.555166 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-public-tls-certs\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.555249 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-config-data\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.658465 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-config-data\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.658561 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8nrf\" (UniqueName: \"kubernetes.io/projected/122ef44f-951b-4aa0-bef1-d190f7b5a495-kube-api-access-n8nrf\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.658601 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-combined-ca-bundle\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.658740 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-internal-tls-certs\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.658808 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-scripts\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.658922 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-public-tls-certs\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.678435 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-combined-ca-bundle\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.678799 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-public-tls-certs\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 
13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.681114 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-internal-tls-certs\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.682184 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-scripts\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.685517 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ef44f-951b-4aa0-bef1-d190f7b5a495-config-data\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.701342 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8nrf\" (UniqueName: \"kubernetes.io/projected/122ef44f-951b-4aa0-bef1-d190f7b5a495-kube-api-access-n8nrf\") pod \"aodh-0\" (UID: \"122ef44f-951b-4aa0-bef1-d190f7b5a495\") " pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.773849 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 13:33:18 crc kubenswrapper[4912]: I0318 13:33:18.957892 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.070053 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj67w\" (UniqueName: \"kubernetes.io/projected/0e7cc04c-de03-4e24-b041-663be152ac0e-kube-api-access-xj67w\") pod \"0e7cc04c-de03-4e24-b041-663be152ac0e\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.070297 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-inventory\") pod \"0e7cc04c-de03-4e24-b041-663be152ac0e\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.070688 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-ssh-key-openstack-edpm-ipam\") pod \"0e7cc04c-de03-4e24-b041-663be152ac0e\" (UID: \"0e7cc04c-de03-4e24-b041-663be152ac0e\") " Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.085080 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7cc04c-de03-4e24-b041-663be152ac0e-kube-api-access-xj67w" (OuterVolumeSpecName: "kube-api-access-xj67w") pod "0e7cc04c-de03-4e24-b041-663be152ac0e" (UID: "0e7cc04c-de03-4e24-b041-663be152ac0e"). InnerVolumeSpecName "kube-api-access-xj67w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.114492 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-inventory" (OuterVolumeSpecName: "inventory") pod "0e7cc04c-de03-4e24-b041-663be152ac0e" (UID: "0e7cc04c-de03-4e24-b041-663be152ac0e"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.116152 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e7cc04c-de03-4e24-b041-663be152ac0e" (UID: "0e7cc04c-de03-4e24-b041-663be152ac0e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.175835 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.175877 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7cc04c-de03-4e24-b041-663be152ac0e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.175889 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj67w\" (UniqueName: \"kubernetes.io/projected/0e7cc04c-de03-4e24-b041-663be152ac0e-kube-api-access-xj67w\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.298011 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" event={"ID":"0e7cc04c-de03-4e24-b041-663be152ac0e","Type":"ContainerDied","Data":"b17cb46ce7a369a5bf2beefbf837f492ca45fe13d41f9218e2f061968862247d"} Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.299550 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b17cb46ce7a369a5bf2beefbf837f492ca45fe13d41f9218e2f061968862247d" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 
13:33:19.298295 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qwgqp" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.386599 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.408342 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49"] Mar 18 13:33:19 crc kubenswrapper[4912]: E0318 13:33:19.409615 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7cc04c-de03-4e24-b041-663be152ac0e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.409637 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7cc04c-de03-4e24-b041-663be152ac0e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.410104 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7cc04c-de03-4e24-b041-663be152ac0e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.411384 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.415009 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.415718 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.415973 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.416127 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.423995 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49"] Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.484417 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.484816 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdvj\" (UniqueName: \"kubernetes.io/projected/c04db868-dfd5-464a-97c3-437a011e243a-kube-api-access-lkdvj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 
13:33:19.484977 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.485122 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.588305 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.588443 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdvj\" (UniqueName: \"kubernetes.io/projected/c04db868-dfd5-464a-97c3-437a011e243a-kube-api-access-lkdvj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.588498 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.588541 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.593616 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.594340 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.596518 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.611445 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdvj\" (UniqueName: \"kubernetes.io/projected/c04db868-dfd5-464a-97c3-437a011e243a-kube-api-access-lkdvj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:19 crc kubenswrapper[4912]: I0318 13:33:19.780808 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:33:21 crc kubenswrapper[4912]: I0318 13:33:20.292599 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37dd0f9-c4f8-4859-a45a-b821c2584b8a" path="/var/lib/kubelet/pods/e37dd0f9-c4f8-4859-a45a-b821c2584b8a/volumes" Mar 18 13:33:21 crc kubenswrapper[4912]: I0318 13:33:20.342591 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"122ef44f-951b-4aa0-bef1-d190f7b5a495","Type":"ContainerStarted","Data":"a6b86066b637c668c25a1f44188cfdc8664fc2665fe198dc0444efa7209e84c6"} Mar 18 13:33:21 crc kubenswrapper[4912]: I0318 13:33:20.342658 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"122ef44f-951b-4aa0-bef1-d190f7b5a495","Type":"ContainerStarted","Data":"c9a2b91c972f4936646b92c436693b5f8ff12d2635a7ff35b8714f2cec168b3c"} Mar 18 13:33:21 crc kubenswrapper[4912]: W0318 13:33:20.469247 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc04db868_dfd5_464a_97c3_437a011e243a.slice/crio-52ae09e259d8ed4b5cdf2100c5a8810769022d72f172ced108bc4deae6ba8b6b WatchSource:0}: Error finding container 52ae09e259d8ed4b5cdf2100c5a8810769022d72f172ced108bc4deae6ba8b6b: Status 404 returned error 
can't find the container with id 52ae09e259d8ed4b5cdf2100c5a8810769022d72f172ced108bc4deae6ba8b6b Mar 18 13:33:21 crc kubenswrapper[4912]: I0318 13:33:20.472879 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49"] Mar 18 13:33:21 crc kubenswrapper[4912]: I0318 13:33:21.359088 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" event={"ID":"c04db868-dfd5-464a-97c3-437a011e243a","Type":"ContainerStarted","Data":"52ae09e259d8ed4b5cdf2100c5a8810769022d72f172ced108bc4deae6ba8b6b"} Mar 18 13:33:22 crc kubenswrapper[4912]: I0318 13:33:22.377696 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" event={"ID":"c04db868-dfd5-464a-97c3-437a011e243a","Type":"ContainerStarted","Data":"291d13c183739b05aa8277a4bce51b96c6b71cce09618e00905140c8a243ccd4"} Mar 18 13:33:22 crc kubenswrapper[4912]: I0318 13:33:22.380350 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"122ef44f-951b-4aa0-bef1-d190f7b5a495","Type":"ContainerStarted","Data":"d00abe93d28c615f2ab9b96fb28f844350d1953dbd54cdf46d3168f7f62f8574"} Mar 18 13:33:22 crc kubenswrapper[4912]: I0318 13:33:22.414662 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" podStartSLOduration=2.829361167 podStartE2EDuration="3.414640761s" podCreationTimestamp="2026-03-18 13:33:19 +0000 UTC" firstStartedPulling="2026-03-18 13:33:20.472616453 +0000 UTC m=+1848.932043878" lastFinishedPulling="2026-03-18 13:33:21.057896047 +0000 UTC m=+1849.517323472" observedRunningTime="2026-03-18 13:33:22.396410951 +0000 UTC m=+1850.855838396" watchObservedRunningTime="2026-03-18 13:33:22.414640761 +0000 UTC m=+1850.874068186" Mar 18 13:33:23 crc kubenswrapper[4912]: I0318 13:33:23.407889 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"122ef44f-951b-4aa0-bef1-d190f7b5a495","Type":"ContainerStarted","Data":"48d951f9269f636dc2bb5403fde5971a62b4fd72bf82079fb7448d0ad7fb092c"} Mar 18 13:33:24 crc kubenswrapper[4912]: I0318 13:33:24.426671 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"122ef44f-951b-4aa0-bef1-d190f7b5a495","Type":"ContainerStarted","Data":"da630359643231a766b4be29178a2c233192e58be86239c7f7eb7d4e78c95834"} Mar 18 13:33:30 crc kubenswrapper[4912]: I0318 13:33:30.229603 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:33:30 crc kubenswrapper[4912]: E0318 13:33:30.230984 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:33:45 crc kubenswrapper[4912]: I0318 13:33:45.228348 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:33:45 crc kubenswrapper[4912]: E0318 13:33:45.229524 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:33:49 crc kubenswrapper[4912]: I0318 13:33:49.842330 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="db9357ac-df66-4e60-bddf-38d4f8847623" containerID="d42dae28db0a37386bc6460f141dcc53b4f42dc060bd57fa0c6a7600b9fbe330" exitCode=0 Mar 18 13:33:49 crc kubenswrapper[4912]: I0318 13:33:49.842412 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"db9357ac-df66-4e60-bddf-38d4f8847623","Type":"ContainerDied","Data":"d42dae28db0a37386bc6460f141dcc53b4f42dc060bd57fa0c6a7600b9fbe330"} Mar 18 13:33:49 crc kubenswrapper[4912]: I0318 13:33:49.878308 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=27.526018489 podStartE2EDuration="31.878276953s" podCreationTimestamp="2026-03-18 13:33:18 +0000 UTC" firstStartedPulling="2026-03-18 13:33:19.389367311 +0000 UTC m=+1847.848794736" lastFinishedPulling="2026-03-18 13:33:23.741625775 +0000 UTC m=+1852.201053200" observedRunningTime="2026-03-18 13:33:24.477508458 +0000 UTC m=+1852.936935903" watchObservedRunningTime="2026-03-18 13:33:49.878276953 +0000 UTC m=+1878.337704378" Mar 18 13:33:50 crc kubenswrapper[4912]: I0318 13:33:50.859902 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"db9357ac-df66-4e60-bddf-38d4f8847623","Type":"ContainerStarted","Data":"10c2c17232a133d8cbe51aa29b4d04b81c508729c25393296105be897f63643f"} Mar 18 13:33:50 crc kubenswrapper[4912]: I0318 13:33:50.861297 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 18 13:33:50 crc kubenswrapper[4912]: I0318 13:33:50.908810 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=36.908783066 podStartE2EDuration="36.908783066s" podCreationTimestamp="2026-03-18 13:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:33:50.891176413 +0000 UTC m=+1879.350603868" 
watchObservedRunningTime="2026-03-18 13:33:50.908783066 +0000 UTC m=+1879.368210491" Mar 18 13:33:58 crc kubenswrapper[4912]: I0318 13:33:58.705736 4912 scope.go:117] "RemoveContainer" containerID="51c4a7ce9f97958dc39024e55b9c6f3c087a3e537706719be3968d3a3420fe9b" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.169161 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564014-gz98k"] Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.171565 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-gz98k" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.174743 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.175566 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.176187 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.187050 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-gz98k"] Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.229404 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:34:00 crc kubenswrapper[4912]: E0318 13:34:00.230073 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.250793 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nj5h\" (UniqueName: \"kubernetes.io/projected/e20507bb-693e-4deb-b781-b1358d0c9871-kube-api-access-2nj5h\") pod \"auto-csr-approver-29564014-gz98k\" (UID: \"e20507bb-693e-4deb-b781-b1358d0c9871\") " pod="openshift-infra/auto-csr-approver-29564014-gz98k" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.355634 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nj5h\" (UniqueName: \"kubernetes.io/projected/e20507bb-693e-4deb-b781-b1358d0c9871-kube-api-access-2nj5h\") pod \"auto-csr-approver-29564014-gz98k\" (UID: \"e20507bb-693e-4deb-b781-b1358d0c9871\") " pod="openshift-infra/auto-csr-approver-29564014-gz98k" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.396544 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nj5h\" (UniqueName: \"kubernetes.io/projected/e20507bb-693e-4deb-b781-b1358d0c9871-kube-api-access-2nj5h\") pod \"auto-csr-approver-29564014-gz98k\" (UID: \"e20507bb-693e-4deb-b781-b1358d0c9871\") " pod="openshift-infra/auto-csr-approver-29564014-gz98k" Mar 18 13:34:00 crc kubenswrapper[4912]: I0318 13:34:00.502703 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-gz98k" Mar 18 13:34:01 crc kubenswrapper[4912]: I0318 13:34:01.123419 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-gz98k"] Mar 18 13:34:02 crc kubenswrapper[4912]: I0318 13:34:02.018589 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-gz98k" event={"ID":"e20507bb-693e-4deb-b781-b1358d0c9871","Type":"ContainerStarted","Data":"2ecce9ec4abc38c0527a2f8d2ccf6a8c1c7dafbb6d39110dd7d7c737e587a951"} Mar 18 13:34:03 crc kubenswrapper[4912]: I0318 13:34:03.037353 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-gz98k" event={"ID":"e20507bb-693e-4deb-b781-b1358d0c9871","Type":"ContainerStarted","Data":"628b8ab9fc3c79747a5a94520aec930f980fa9a8222a978992336e29d4e52e87"} Mar 18 13:34:03 crc kubenswrapper[4912]: I0318 13:34:03.064731 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564014-gz98k" podStartSLOduration=2.101061628 podStartE2EDuration="3.064706791s" podCreationTimestamp="2026-03-18 13:34:00 +0000 UTC" firstStartedPulling="2026-03-18 13:34:01.126196982 +0000 UTC m=+1889.585624407" lastFinishedPulling="2026-03-18 13:34:02.089842145 +0000 UTC m=+1890.549269570" observedRunningTime="2026-03-18 13:34:03.054957708 +0000 UTC m=+1891.514385183" watchObservedRunningTime="2026-03-18 13:34:03.064706791 +0000 UTC m=+1891.524134216" Mar 18 13:34:04 crc kubenswrapper[4912]: I0318 13:34:04.057606 4912 generic.go:334] "Generic (PLEG): container finished" podID="e20507bb-693e-4deb-b781-b1358d0c9871" containerID="628b8ab9fc3c79747a5a94520aec930f980fa9a8222a978992336e29d4e52e87" exitCode=0 Mar 18 13:34:04 crc kubenswrapper[4912]: I0318 13:34:04.058017 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-gz98k" 
event={"ID":"e20507bb-693e-4deb-b781-b1358d0c9871","Type":"ContainerDied","Data":"628b8ab9fc3c79747a5a94520aec930f980fa9a8222a978992336e29d4e52e87"} Mar 18 13:34:04 crc kubenswrapper[4912]: I0318 13:34:04.659426 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 18 13:34:04 crc kubenswrapper[4912]: I0318 13:34:04.725470 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:34:05 crc kubenswrapper[4912]: I0318 13:34:05.743441 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-gz98k" Mar 18 13:34:05 crc kubenswrapper[4912]: I0318 13:34:05.848730 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nj5h\" (UniqueName: \"kubernetes.io/projected/e20507bb-693e-4deb-b781-b1358d0c9871-kube-api-access-2nj5h\") pod \"e20507bb-693e-4deb-b781-b1358d0c9871\" (UID: \"e20507bb-693e-4deb-b781-b1358d0c9871\") " Mar 18 13:34:05 crc kubenswrapper[4912]: I0318 13:34:05.859003 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20507bb-693e-4deb-b781-b1358d0c9871-kube-api-access-2nj5h" (OuterVolumeSpecName: "kube-api-access-2nj5h") pod "e20507bb-693e-4deb-b781-b1358d0c9871" (UID: "e20507bb-693e-4deb-b781-b1358d0c9871"). InnerVolumeSpecName "kube-api-access-2nj5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:05 crc kubenswrapper[4912]: I0318 13:34:05.953281 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nj5h\" (UniqueName: \"kubernetes.io/projected/e20507bb-693e-4deb-b781-b1358d0c9871-kube-api-access-2nj5h\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:06 crc kubenswrapper[4912]: I0318 13:34:06.102605 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-gz98k" event={"ID":"e20507bb-693e-4deb-b781-b1358d0c9871","Type":"ContainerDied","Data":"2ecce9ec4abc38c0527a2f8d2ccf6a8c1c7dafbb6d39110dd7d7c737e587a951"} Mar 18 13:34:06 crc kubenswrapper[4912]: I0318 13:34:06.102654 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ecce9ec4abc38c0527a2f8d2ccf6a8c1c7dafbb6d39110dd7d7c737e587a951" Mar 18 13:34:06 crc kubenswrapper[4912]: I0318 13:34:06.102725 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-gz98k" Mar 18 13:34:06 crc kubenswrapper[4912]: I0318 13:34:06.861610 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-fqc26"] Mar 18 13:34:06 crc kubenswrapper[4912]: I0318 13:34:06.876552 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-fqc26"] Mar 18 13:34:08 crc kubenswrapper[4912]: I0318 13:34:08.244307 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284c4dc7-ec24-4baa-91e7-f0540ed73054" path="/var/lib/kubelet/pods/284c4dc7-ec24-4baa-91e7-f0540ed73054/volumes" Mar 18 13:34:10 crc kubenswrapper[4912]: I0318 13:34:10.307899 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="rabbitmq" 
containerID="cri-o://e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787" gracePeriod=604795 Mar 18 13:34:14 crc kubenswrapper[4912]: I0318 13:34:14.228749 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:34:14 crc kubenswrapper[4912]: E0318 13:34:14.230165 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:34:14 crc kubenswrapper[4912]: I0318 13:34:14.994979 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.062294 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.211114 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rgpx\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-kube-api-access-2rgpx\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.211256 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-plugins\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.212217 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.212376 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-server-conf\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.212559 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-config-data\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.212609 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-tls\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.212666 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e25d0c6c-24af-4cb6-b961-ae312ec23df9-pod-info\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.212692 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.213559 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-plugins-conf\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.213609 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-erlang-cookie\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.213645 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e25d0c6c-24af-4cb6-b961-ae312ec23df9-erlang-cookie-secret\") pod 
\"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.213845 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-confd\") pod \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\" (UID: \"e25d0c6c-24af-4cb6-b961-ae312ec23df9\") " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.214206 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.215064 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.235343 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.235691 4912 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.236115 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.243323 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e25d0c6c-24af-4cb6-b961-ae312ec23df9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.243441 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.245542 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e25d0c6c-24af-4cb6-b961-ae312ec23df9-pod-info" (OuterVolumeSpecName: "pod-info") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.250972 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-kube-api-access-2rgpx" (OuterVolumeSpecName: "kube-api-access-2rgpx") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "kube-api-access-2rgpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.268140 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf" (OuterVolumeSpecName: "persistence") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.280133 4912 generic.go:334] "Generic (PLEG): container finished" podID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerID="e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787" exitCode=0 Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.280188 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e25d0c6c-24af-4cb6-b961-ae312ec23df9","Type":"ContainerDied","Data":"e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787"} Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.280221 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e25d0c6c-24af-4cb6-b961-ae312ec23df9","Type":"ContainerDied","Data":"b9ad57e4bc13d286c13ff050cb1e331aef87770cd4bcf9771cfd32e16458286e"} Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.280240 4912 scope.go:117] "RemoveContainer" containerID="e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.280372 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.287165 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-config-data" (OuterVolumeSpecName: "config-data") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.330010 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-server-conf" (OuterVolumeSpecName: "server-conf") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.339410 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") on node \"crc\" " Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.341866 4912 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.341906 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e25d0c6c-24af-4cb6-b961-ae312ec23df9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.341916 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.341930 4912 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e25d0c6c-24af-4cb6-b961-ae312ec23df9-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.341943 4912 reconciler_common.go:293] "Volume detached for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e25d0c6c-24af-4cb6-b961-ae312ec23df9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.341959 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rgpx\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-kube-api-access-2rgpx\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.390106 4912 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.390353 4912 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf") on node "crc" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.402419 4912 scope.go:117] "RemoveContainer" containerID="6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.413741 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e25d0c6c-24af-4cb6-b961-ae312ec23df9" (UID: "e25d0c6c-24af-4cb6-b961-ae312ec23df9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.445620 4912 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e25d0c6c-24af-4cb6-b961-ae312ec23df9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.445667 4912 reconciler_common.go:293] "Volume detached for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.472375 4912 scope.go:117] "RemoveContainer" containerID="e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787" Mar 18 13:34:17 crc kubenswrapper[4912]: E0318 13:34:17.472905 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787\": container with ID starting with e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787 not found: ID does not exist" containerID="e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.472940 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787"} err="failed to get container status \"e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787\": rpc error: code = NotFound desc = could not find container \"e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787\": container with ID starting with e2cec7c16eb28b65e5769b865b024a908e0eed2353e7ee75585091b5b4871787 not found: ID does not exist" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.472964 4912 scope.go:117] "RemoveContainer" 
containerID="6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1" Mar 18 13:34:17 crc kubenswrapper[4912]: E0318 13:34:17.473345 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1\": container with ID starting with 6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1 not found: ID does not exist" containerID="6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.473429 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1"} err="failed to get container status \"6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1\": rpc error: code = NotFound desc = could not find container \"6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1\": container with ID starting with 6d39e29740850b828262b59f552cac8a32d4d9f245aac5a6224ce9e9928e78a1 not found: ID does not exist" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.633297 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.655169 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.680942 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:34:17 crc kubenswrapper[4912]: E0318 13:34:17.681600 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20507bb-693e-4deb-b781-b1358d0c9871" containerName="oc" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.681629 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20507bb-693e-4deb-b781-b1358d0c9871" containerName="oc" Mar 18 13:34:17 crc 
kubenswrapper[4912]: E0318 13:34:17.681675 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="rabbitmq" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.681684 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="rabbitmq" Mar 18 13:34:17 crc kubenswrapper[4912]: E0318 13:34:17.681725 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="setup-container" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.681734 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="setup-container" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.681999 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" containerName="rabbitmq" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.682030 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20507bb-693e-4deb-b781-b1358d0c9871" containerName="oc" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.690999 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.714331 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859147 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859344 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrtz\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-kube-api-access-5lrtz\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859403 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859460 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-config-data\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859569 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859606 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859678 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859755 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859820 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.859884 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.860249 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.962907 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.962986 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963074 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963163 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963212 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrtz\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-kube-api-access-5lrtz\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963234 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963253 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-config-data\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963284 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963303 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963327 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.963351 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.964262 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.964594 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.965539 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.965743 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-config-data\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.966797 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.967385 4912 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.967421 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aa201d4246c012acd5c41babb6a867c820a838e5b585b432c358397deecb8fad/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.970957 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.970964 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " 
pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.971828 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.974221 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:17 crc kubenswrapper[4912]: I0318 13:34:17.989135 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrtz\" (UniqueName: \"kubernetes.io/projected/775a5a2c-1365-4984-9e9f-a11cd7f48bb9-kube-api-access-5lrtz\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:18 crc kubenswrapper[4912]: I0318 13:34:18.075191 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1cc3487-b305-4acc-84c2-7b8dd20f80bf\") pod \"rabbitmq-server-0\" (UID: \"775a5a2c-1365-4984-9e9f-a11cd7f48bb9\") " pod="openstack/rabbitmq-server-0" Mar 18 13:34:18 crc kubenswrapper[4912]: I0318 13:34:18.290398 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e25d0c6c-24af-4cb6-b961-ae312ec23df9" path="/var/lib/kubelet/pods/e25d0c6c-24af-4cb6-b961-ae312ec23df9/volumes" Mar 18 13:34:18 crc kubenswrapper[4912]: I0318 13:34:18.330149 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:34:18 crc kubenswrapper[4912]: I0318 13:34:18.936591 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:34:19 crc kubenswrapper[4912]: I0318 13:34:19.315653 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"775a5a2c-1365-4984-9e9f-a11cd7f48bb9","Type":"ContainerStarted","Data":"5501a414fe9f147f78ba273ff59a04b37b976a10bc980ab59f026999ca67efdc"} Mar 18 13:34:21 crc kubenswrapper[4912]: I0318 13:34:21.345405 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"775a5a2c-1365-4984-9e9f-a11cd7f48bb9","Type":"ContainerStarted","Data":"9f6bd47889c666b79125e5cd2e55437a0ee5ef7e1f47e2fca08f9504be36b2f8"} Mar 18 13:34:25 crc kubenswrapper[4912]: I0318 13:34:25.229411 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:34:25 crc kubenswrapper[4912]: E0318 13:34:25.230684 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:34:40 crc kubenswrapper[4912]: I0318 13:34:40.232225 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:34:40 crc kubenswrapper[4912]: E0318 13:34:40.233330 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:34:53 crc kubenswrapper[4912]: I0318 13:34:53.834240 4912 generic.go:334] "Generic (PLEG): container finished" podID="775a5a2c-1365-4984-9e9f-a11cd7f48bb9" containerID="9f6bd47889c666b79125e5cd2e55437a0ee5ef7e1f47e2fca08f9504be36b2f8" exitCode=0 Mar 18 13:34:53 crc kubenswrapper[4912]: I0318 13:34:53.834362 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"775a5a2c-1365-4984-9e9f-a11cd7f48bb9","Type":"ContainerDied","Data":"9f6bd47889c666b79125e5cd2e55437a0ee5ef7e1f47e2fca08f9504be36b2f8"} Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.080311 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-d4t94"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.095297 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-w9gjq"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.108724 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f5d9-account-create-update-wrz5v"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.127441 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-d4t94"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.143076 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-34ea-account-create-update-jhclb"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.156618 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-l67rx"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.192363 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8eb0-account-create-update-mls4v"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 
13:34:54.221341 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f5d9-account-create-update-wrz5v"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.285879 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef449253-2c62-47bf-a6aa-513c1c28de28" path="/var/lib/kubelet/pods/ef449253-2c62-47bf-a6aa-513c1c28de28/volumes" Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.315074 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a595f8-7a9b-4c81-ab92-71a0618740e3" path="/var/lib/kubelet/pods/f3a595f8-7a9b-4c81-ab92-71a0618740e3/volumes" Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.317058 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-34ea-account-create-update-jhclb"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.317110 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-w9gjq"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.317131 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-l67rx"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.317147 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8eb0-account-create-update-mls4v"] Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.851966 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"775a5a2c-1365-4984-9e9f-a11cd7f48bb9","Type":"ContainerStarted","Data":"11520d93be836e4ac825ce2b68de149425fc121546463a299b398af0e6fbaf3f"} Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.853877 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 13:34:54 crc kubenswrapper[4912]: I0318 13:34:54.887127 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.887009993 
podStartE2EDuration="37.887009993s" podCreationTimestamp="2026-03-18 13:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:34:54.886347505 +0000 UTC m=+1943.345774950" watchObservedRunningTime="2026-03-18 13:34:54.887009993 +0000 UTC m=+1943.346437418" Mar 18 13:34:55 crc kubenswrapper[4912]: I0318 13:34:55.056671 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c8cdk"] Mar 18 13:34:55 crc kubenswrapper[4912]: I0318 13:34:55.073087 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c8cdk"] Mar 18 13:34:55 crc kubenswrapper[4912]: I0318 13:34:55.227751 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:34:55 crc kubenswrapper[4912]: E0318 13:34:55.228142 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.041899 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-e14e-account-create-update-jt7n6"] Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.056502 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-e14e-account-create-update-jt7n6"] Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.244554 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218793dd-ce60-49c1-87b3-43176d51e230" 
path="/var/lib/kubelet/pods/218793dd-ce60-49c1-87b3-43176d51e230/volumes" Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.248676 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec" path="/var/lib/kubelet/pods/39ecc8d4-f5f7-4536-9cd6-f9c6e6d7ebec/volumes" Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.250302 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821a74e9-e276-47c9-8401-7de9010901bf" path="/var/lib/kubelet/pods/821a74e9-e276-47c9-8401-7de9010901bf/volumes" Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.252638 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a03ef05-1ed6-4212-b1b6-c904231640f8" path="/var/lib/kubelet/pods/8a03ef05-1ed6-4212-b1b6-c904231640f8/volumes" Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.255206 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc016d4-4749-4456-92a5-b66653a5ef44" path="/var/lib/kubelet/pods/9fc016d4-4749-4456-92a5-b66653a5ef44/volumes" Mar 18 13:34:56 crc kubenswrapper[4912]: I0318 13:34:56.257765 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ed6e1a-405b-4599-8c25-abb534946198" path="/var/lib/kubelet/pods/e6ed6e1a-405b-4599-8c25-abb534946198/volumes" Mar 18 13:34:58 crc kubenswrapper[4912]: I0318 13:34:58.872134 4912 scope.go:117] "RemoveContainer" containerID="d123da7fc856b3287879b131319b8b60419c82b22c02106327aba5b7ab445734" Mar 18 13:34:58 crc kubenswrapper[4912]: I0318 13:34:58.936579 4912 scope.go:117] "RemoveContainer" containerID="fa7b12ecc7c712600e34cec47150811b9f52ca500907a4dbe254b7e58775c327" Mar 18 13:34:58 crc kubenswrapper[4912]: I0318 13:34:58.986593 4912 scope.go:117] "RemoveContainer" containerID="b51340747b1ea1d80e2f5c93d896ea9566f503f0a45e533a8553bb438e865f0a" Mar 18 13:34:59 crc kubenswrapper[4912]: I0318 13:34:59.052990 4912 scope.go:117] "RemoveContainer" 
containerID="3976496a9893de86d041784048f05dd3695668c9d8dbfb057d025fa32bc745f8" Mar 18 13:34:59 crc kubenswrapper[4912]: I0318 13:34:59.116322 4912 scope.go:117] "RemoveContainer" containerID="400cdf6a5f4d818bb783d661246690a10d698896cc7940a218a9380b1378cc87" Mar 18 13:34:59 crc kubenswrapper[4912]: I0318 13:34:59.184366 4912 scope.go:117] "RemoveContainer" containerID="ac8e1b96612ce2bbc73571c2b84205073b8a8d919dfad7f790dc49c65dbde137" Mar 18 13:34:59 crc kubenswrapper[4912]: I0318 13:34:59.250759 4912 scope.go:117] "RemoveContainer" containerID="3021e363fa6e06db15905c89f5fad26489c0f3a64358ea0805af2b0d25105304" Mar 18 13:34:59 crc kubenswrapper[4912]: I0318 13:34:59.281766 4912 scope.go:117] "RemoveContainer" containerID="3ce2b7e9a4d84c9e08cdcead47663224e500e4dcf6e29ab26f1b66bfe95be8b4" Mar 18 13:34:59 crc kubenswrapper[4912]: I0318 13:34:59.319495 4912 scope.go:117] "RemoveContainer" containerID="91206e7c251655068ce028fb1d382d87d4d66cd7c4e9d12c0e586bf6aff6ba57" Mar 18 13:34:59 crc kubenswrapper[4912]: I0318 13:34:59.360749 4912 scope.go:117] "RemoveContainer" containerID="dff4a79f7afa21763a6f27f98e7a28b52b90389b22866c2264a4fb27db600efd" Mar 18 13:35:06 crc kubenswrapper[4912]: I0318 13:35:06.228696 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:35:06 crc kubenswrapper[4912]: E0318 13:35:06.229656 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:35:08 crc kubenswrapper[4912]: I0318 13:35:08.335335 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 
18 13:35:12 crc kubenswrapper[4912]: I0318 13:35:12.049701 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-cvs82"] Mar 18 13:35:12 crc kubenswrapper[4912]: I0318 13:35:12.073146 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-28d7-account-create-update-f7776"] Mar 18 13:35:12 crc kubenswrapper[4912]: I0318 13:35:12.115691 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-cvs82"] Mar 18 13:35:12 crc kubenswrapper[4912]: I0318 13:35:12.133127 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-28d7-account-create-update-f7776"] Mar 18 13:35:12 crc kubenswrapper[4912]: I0318 13:35:12.244159 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f5291e-16e8-4832-925e-05e2e5406607" path="/var/lib/kubelet/pods/01f5291e-16e8-4832-925e-05e2e5406607/volumes" Mar 18 13:35:12 crc kubenswrapper[4912]: I0318 13:35:12.245443 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="885ba47e-6a32-4a32-86c0-a6dbb63c33b0" path="/var/lib/kubelet/pods/885ba47e-6a32-4a32-86c0-a6dbb63c33b0/volumes" Mar 18 13:35:18 crc kubenswrapper[4912]: I0318 13:35:18.229017 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:35:18 crc kubenswrapper[4912]: E0318 13:35:18.230521 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:35:32 crc kubenswrapper[4912]: I0318 13:35:32.243615 4912 scope.go:117] 
"RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:35:32 crc kubenswrapper[4912]: E0318 13:35:32.245098 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:35:35 crc kubenswrapper[4912]: I0318 13:35:35.055728 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-q2jgz"] Mar 18 13:35:35 crc kubenswrapper[4912]: I0318 13:35:35.065756 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-q2jgz"] Mar 18 13:35:36 crc kubenswrapper[4912]: I0318 13:35:36.245987 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2" path="/var/lib/kubelet/pods/4bb13f5d-2aaa-4efb-b7a5-4e3a477d15f2/volumes" Mar 18 13:35:44 crc kubenswrapper[4912]: I0318 13:35:44.228888 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:35:44 crc kubenswrapper[4912]: E0318 13:35:44.229642 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.075745 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-create-rbnsb"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.094207 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-52ed-account-create-update-d48pr"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.115248 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ebf9-account-create-update-8tl2r"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.135207 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-rbnsb"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.151806 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-skzn7"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.165110 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-de81-account-create-update-mxs2s"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.180006 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-52ed-account-create-update-d48pr"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.193501 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ebf9-account-create-update-8tl2r"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.206059 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-55c0-account-create-update-dg7pc"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.218864 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-55c0-account-create-update-dg7pc"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.249261 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eef913b-f65b-41a9-b0fa-3463914463f5" path="/var/lib/kubelet/pods/4eef913b-f65b-41a9-b0fa-3463914463f5/volumes" Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.255222 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4fa24fe7-cd66-47b6-9154-101f961c8482" path="/var/lib/kubelet/pods/4fa24fe7-cd66-47b6-9154-101f961c8482/volumes" Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.258275 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb39d68-1138-410e-9577-197e9ff4b0c5" path="/var/lib/kubelet/pods/6cb39d68-1138-410e-9577-197e9ff4b0c5/volumes" Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.267754 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7327a84-0a21-4528-bd67-8a43d103e004" path="/var/lib/kubelet/pods/d7327a84-0a21-4528-bd67-8a43d103e004/volumes" Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.270191 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-de81-account-create-update-mxs2s"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.270259 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xpx4j"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.270278 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-skzn7"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.304354 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8cwkc"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.323327 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8cwkc"] Mar 18 13:35:48 crc kubenswrapper[4912]: I0318 13:35:48.340634 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xpx4j"] Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.108792 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rd646"] Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.112821 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.122202 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd646"] Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.248493 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1" path="/var/lib/kubelet/pods/a3b22c11-01a0-4b5d-99ab-a1c4195cbdc1/volumes" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.249800 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5da76f4-5031-4a81-ae19-96d01814f859" path="/var/lib/kubelet/pods/a5da76f4-5031-4a81-ae19-96d01814f859/volumes" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.250594 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6588379-d349-492e-a673-8f75b93fd640" path="/var/lib/kubelet/pods/f6588379-d349-492e-a673-8f75b93fd640/volumes" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.252550 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb2aa83-efef-4845-bfd5-ae8bf926f515" path="/var/lib/kubelet/pods/ffb2aa83-efef-4845-bfd5-ae8bf926f515/volumes" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.255817 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxbl9\" (UniqueName: \"kubernetes.io/projected/b620a5eb-4f69-4666-9e93-1b00800a07ec-kube-api-access-wxbl9\") pod \"redhat-marketplace-rd646\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.255959 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-catalog-content\") pod \"redhat-marketplace-rd646\" (UID: 
\"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.256043 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-utilities\") pod \"redhat-marketplace-rd646\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.358515 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxbl9\" (UniqueName: \"kubernetes.io/projected/b620a5eb-4f69-4666-9e93-1b00800a07ec-kube-api-access-wxbl9\") pod \"redhat-marketplace-rd646\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.358683 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-catalog-content\") pod \"redhat-marketplace-rd646\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.358785 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-utilities\") pod \"redhat-marketplace-rd646\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.359839 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-catalog-content\") pod \"redhat-marketplace-rd646\" (UID: 
\"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.360185 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-utilities\") pod \"redhat-marketplace-rd646\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.384373 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxbl9\" (UniqueName: \"kubernetes.io/projected/b620a5eb-4f69-4666-9e93-1b00800a07ec-kube-api-access-wxbl9\") pod \"redhat-marketplace-rd646\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:50 crc kubenswrapper[4912]: I0318 13:35:50.451733 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:35:51 crc kubenswrapper[4912]: I0318 13:35:51.064518 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd646"] Mar 18 13:35:51 crc kubenswrapper[4912]: I0318 13:35:51.614873 4912 generic.go:334] "Generic (PLEG): container finished" podID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerID="d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3" exitCode=0 Mar 18 13:35:51 crc kubenswrapper[4912]: I0318 13:35:51.615005 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd646" event={"ID":"b620a5eb-4f69-4666-9e93-1b00800a07ec","Type":"ContainerDied","Data":"d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3"} Mar 18 13:35:51 crc kubenswrapper[4912]: I0318 13:35:51.615407 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd646" 
event={"ID":"b620a5eb-4f69-4666-9e93-1b00800a07ec","Type":"ContainerStarted","Data":"25c358e2d99427544bcc72e6f7ee93a9a45ad7c41393d776830df159cdafd5ca"} Mar 18 13:35:52 crc kubenswrapper[4912]: I0318 13:35:52.640584 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd646" event={"ID":"b620a5eb-4f69-4666-9e93-1b00800a07ec","Type":"ContainerStarted","Data":"bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120"} Mar 18 13:35:54 crc kubenswrapper[4912]: I0318 13:35:54.676221 4912 generic.go:334] "Generic (PLEG): container finished" podID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerID="bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120" exitCode=0 Mar 18 13:35:54 crc kubenswrapper[4912]: I0318 13:35:54.677161 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd646" event={"ID":"b620a5eb-4f69-4666-9e93-1b00800a07ec","Type":"ContainerDied","Data":"bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120"} Mar 18 13:35:55 crc kubenswrapper[4912]: I0318 13:35:55.691374 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd646" event={"ID":"b620a5eb-4f69-4666-9e93-1b00800a07ec","Type":"ContainerStarted","Data":"f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd"} Mar 18 13:35:55 crc kubenswrapper[4912]: I0318 13:35:55.729366 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rd646" podStartSLOduration=2.150357615 podStartE2EDuration="5.729296387s" podCreationTimestamp="2026-03-18 13:35:50 +0000 UTC" firstStartedPulling="2026-03-18 13:35:51.6178645 +0000 UTC m=+2000.077291925" lastFinishedPulling="2026-03-18 13:35:55.196803282 +0000 UTC m=+2003.656230697" observedRunningTime="2026-03-18 13:35:55.711789415 +0000 UTC m=+2004.171216840" watchObservedRunningTime="2026-03-18 13:35:55.729296387 +0000 UTC 
m=+2004.188723812" Mar 18 13:35:56 crc kubenswrapper[4912]: I0318 13:35:56.043861 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-krlbd"] Mar 18 13:35:56 crc kubenswrapper[4912]: I0318 13:35:56.058000 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-krlbd"] Mar 18 13:35:56 crc kubenswrapper[4912]: I0318 13:35:56.075211 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dhhd8"] Mar 18 13:35:56 crc kubenswrapper[4912]: I0318 13:35:56.091017 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dhhd8"] Mar 18 13:35:56 crc kubenswrapper[4912]: I0318 13:35:56.243209 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92c61dc-cfdf-4610-81b7-553c9882fc26" path="/var/lib/kubelet/pods/a92c61dc-cfdf-4610-81b7-553c9882fc26/volumes" Mar 18 13:35:56 crc kubenswrapper[4912]: I0318 13:35:56.245113 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77cb1e2-3c24-41cb-95fa-ff54327ae194" path="/var/lib/kubelet/pods/c77cb1e2-3c24-41cb-95fa-ff54327ae194/volumes" Mar 18 13:35:58 crc kubenswrapper[4912]: I0318 13:35:58.228438 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:35:58 crc kubenswrapper[4912]: E0318 13:35:58.229247 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:35:59 crc kubenswrapper[4912]: I0318 13:35:59.701457 4912 scope.go:117] "RemoveContainer" containerID="d855dc2b8f4b2b41840914014e8856f43312b10cfe76c56df63f499bd087d986" 
Mar 18 13:35:59 crc kubenswrapper[4912]: I0318 13:35:59.753374 4912 scope.go:117] "RemoveContainer" containerID="ebed85dc94f5fa30b16e2bc5ae45567364cac108b0a10991c181baaff476792d" Mar 18 13:35:59 crc kubenswrapper[4912]: I0318 13:35:59.807073 4912 scope.go:117] "RemoveContainer" containerID="2708cfeb2aa827f9b8643cd77458cf9445120d7378b999ce6dd06aeb73d28516" Mar 18 13:35:59 crc kubenswrapper[4912]: I0318 13:35:59.874447 4912 scope.go:117] "RemoveContainer" containerID="1ab04ce43a352a0bacc29661f56e9bbdaefe6338acde430dab65052025f4a6d5" Mar 18 13:35:59 crc kubenswrapper[4912]: I0318 13:35:59.938262 4912 scope.go:117] "RemoveContainer" containerID="93d0dbe94a524ec350f585f38bde45d37b221bde679bcb8649d08e11f3d416e7" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.013688 4912 scope.go:117] "RemoveContainer" containerID="6a218816a0a7c683763c056b96e0291cd8d36ab109896fc9fefabe9abaed5fb6" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.080419 4912 scope.go:117] "RemoveContainer" containerID="c8664325e9a061c4521e14f190f2d5fa603154ae6abaaee3f76074f65dff12f3" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.114891 4912 scope.go:117] "RemoveContainer" containerID="7b3f3e459c9334390333b26ef9e0b392b8bf6f6ab79f21810221ba342e4bc120" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.159943 4912 scope.go:117] "RemoveContainer" containerID="9efea96349526457502ef8478fd55e31eff36e99f3b44dc8234296e34c9d35f0" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.171969 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564016-fsw7b"] Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.174722 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-fsw7b" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.179584 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.179628 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.179995 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.187222 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-fsw7b"] Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.197179 4912 scope.go:117] "RemoveContainer" containerID="15d797596cc9e78fc03d11182af1b80b02c6fc5b0719c282ffe9eb8047503b8d" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.248305 4912 scope.go:117] "RemoveContainer" containerID="58530107af85bb2832504ec1e89d02bfe399c064c47b7b71709664a0cd2f12f7" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.292109 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68smf\" (UniqueName: \"kubernetes.io/projected/57140d4c-a1cf-431b-81f5-d702dab52543-kube-api-access-68smf\") pod \"auto-csr-approver-29564016-fsw7b\" (UID: \"57140d4c-a1cf-431b-81f5-d702dab52543\") " pod="openshift-infra/auto-csr-approver-29564016-fsw7b" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.302453 4912 scope.go:117] "RemoveContainer" containerID="f96c6c3d0bc66135ba7daef9263b6cad85ba1e4cdb4a0f14db822fde24af87ce" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.338618 4912 scope.go:117] "RemoveContainer" containerID="cc8193a78a993ab1c9ae50a3bacb9b07cb5c6845be38a01286cf948cccc10050" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.375019 
4912 scope.go:117] "RemoveContainer" containerID="e981724626d8d5605b3301d04d3e928226dc72275ca7d1310ea94db1e9da75d5" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.397454 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68smf\" (UniqueName: \"kubernetes.io/projected/57140d4c-a1cf-431b-81f5-d702dab52543-kube-api-access-68smf\") pod \"auto-csr-approver-29564016-fsw7b\" (UID: \"57140d4c-a1cf-431b-81f5-d702dab52543\") " pod="openshift-infra/auto-csr-approver-29564016-fsw7b" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.404594 4912 scope.go:117] "RemoveContainer" containerID="692160d8e33d62ebef21f2ea31a10e2a6bc99fa31d8ae9ccd0a76088412ca4f3" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.423907 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68smf\" (UniqueName: \"kubernetes.io/projected/57140d4c-a1cf-431b-81f5-d702dab52543-kube-api-access-68smf\") pod \"auto-csr-approver-29564016-fsw7b\" (UID: \"57140d4c-a1cf-431b-81f5-d702dab52543\") " pod="openshift-infra/auto-csr-approver-29564016-fsw7b" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.452785 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.452828 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:36:00 crc kubenswrapper[4912]: I0318 13:36:00.502757 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-fsw7b" Mar 18 13:36:01 crc kubenswrapper[4912]: I0318 13:36:01.081813 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-fsw7b"] Mar 18 13:36:01 crc kubenswrapper[4912]: I0318 13:36:01.508727 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rd646" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="registry-server" probeResult="failure" output=< Mar 18 13:36:01 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:36:01 crc kubenswrapper[4912]: > Mar 18 13:36:01 crc kubenswrapper[4912]: I0318 13:36:01.789132 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-fsw7b" event={"ID":"57140d4c-a1cf-431b-81f5-d702dab52543","Type":"ContainerStarted","Data":"93d47cd20122479bc3c088020632669023b7a1f2a0760dfdc47f05119833bf65"} Mar 18 13:36:02 crc kubenswrapper[4912]: I0318 13:36:02.803215 4912 generic.go:334] "Generic (PLEG): container finished" podID="57140d4c-a1cf-431b-81f5-d702dab52543" containerID="9e30b972648fb32f4fe04b9acc73cb5d03fcf332ac1b29f167e2a4018ca209cb" exitCode=0 Mar 18 13:36:02 crc kubenswrapper[4912]: I0318 13:36:02.803556 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-fsw7b" event={"ID":"57140d4c-a1cf-431b-81f5-d702dab52543","Type":"ContainerDied","Data":"9e30b972648fb32f4fe04b9acc73cb5d03fcf332ac1b29f167e2a4018ca209cb"} Mar 18 13:36:04 crc kubenswrapper[4912]: I0318 13:36:04.258553 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-fsw7b" Mar 18 13:36:04 crc kubenswrapper[4912]: I0318 13:36:04.328811 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68smf\" (UniqueName: \"kubernetes.io/projected/57140d4c-a1cf-431b-81f5-d702dab52543-kube-api-access-68smf\") pod \"57140d4c-a1cf-431b-81f5-d702dab52543\" (UID: \"57140d4c-a1cf-431b-81f5-d702dab52543\") " Mar 18 13:36:04 crc kubenswrapper[4912]: I0318 13:36:04.346756 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57140d4c-a1cf-431b-81f5-d702dab52543-kube-api-access-68smf" (OuterVolumeSpecName: "kube-api-access-68smf") pod "57140d4c-a1cf-431b-81f5-d702dab52543" (UID: "57140d4c-a1cf-431b-81f5-d702dab52543"). InnerVolumeSpecName "kube-api-access-68smf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:36:04 crc kubenswrapper[4912]: I0318 13:36:04.433972 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68smf\" (UniqueName: \"kubernetes.io/projected/57140d4c-a1cf-431b-81f5-d702dab52543-kube-api-access-68smf\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:04 crc kubenswrapper[4912]: I0318 13:36:04.853299 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-fsw7b" event={"ID":"57140d4c-a1cf-431b-81f5-d702dab52543","Type":"ContainerDied","Data":"93d47cd20122479bc3c088020632669023b7a1f2a0760dfdc47f05119833bf65"} Mar 18 13:36:04 crc kubenswrapper[4912]: I0318 13:36:04.853840 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d47cd20122479bc3c088020632669023b7a1f2a0760dfdc47f05119833bf65" Mar 18 13:36:04 crc kubenswrapper[4912]: I0318 13:36:04.853577 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-fsw7b" Mar 18 13:36:05 crc kubenswrapper[4912]: I0318 13:36:05.340454 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-wh5fm"] Mar 18 13:36:05 crc kubenswrapper[4912]: I0318 13:36:05.355149 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-wh5fm"] Mar 18 13:36:06 crc kubenswrapper[4912]: I0318 13:36:06.242924 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbccc26-4a01-47c6-a224-7b8355108dfa" path="/var/lib/kubelet/pods/4dbccc26-4a01-47c6-a224-7b8355108dfa/volumes" Mar 18 13:36:10 crc kubenswrapper[4912]: I0318 13:36:10.502319 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:36:10 crc kubenswrapper[4912]: I0318 13:36:10.568430 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:36:10 crc kubenswrapper[4912]: I0318 13:36:10.761100 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd646"] Mar 18 13:36:11 crc kubenswrapper[4912]: I0318 13:36:11.942675 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rd646" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="registry-server" containerID="cri-o://f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd" gracePeriod=2 Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.262811 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:36:12 crc kubenswrapper[4912]: E0318 13:36:12.263933 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.505495 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.603584 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxbl9\" (UniqueName: \"kubernetes.io/projected/b620a5eb-4f69-4666-9e93-1b00800a07ec-kube-api-access-wxbl9\") pod \"b620a5eb-4f69-4666-9e93-1b00800a07ec\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.603824 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-catalog-content\") pod \"b620a5eb-4f69-4666-9e93-1b00800a07ec\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.603857 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-utilities\") pod \"b620a5eb-4f69-4666-9e93-1b00800a07ec\" (UID: \"b620a5eb-4f69-4666-9e93-1b00800a07ec\") " Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.605007 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-utilities" (OuterVolumeSpecName: "utilities") pod "b620a5eb-4f69-4666-9e93-1b00800a07ec" (UID: "b620a5eb-4f69-4666-9e93-1b00800a07ec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.612302 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b620a5eb-4f69-4666-9e93-1b00800a07ec-kube-api-access-wxbl9" (OuterVolumeSpecName: "kube-api-access-wxbl9") pod "b620a5eb-4f69-4666-9e93-1b00800a07ec" (UID: "b620a5eb-4f69-4666-9e93-1b00800a07ec"). InnerVolumeSpecName "kube-api-access-wxbl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.645566 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b620a5eb-4f69-4666-9e93-1b00800a07ec" (UID: "b620a5eb-4f69-4666-9e93-1b00800a07ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.707362 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.707411 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620a5eb-4f69-4666-9e93-1b00800a07ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.707429 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxbl9\" (UniqueName: \"kubernetes.io/projected/b620a5eb-4f69-4666-9e93-1b00800a07ec-kube-api-access-wxbl9\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.959304 4912 generic.go:334] "Generic (PLEG): container finished" podID="b620a5eb-4f69-4666-9e93-1b00800a07ec" 
containerID="f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd" exitCode=0 Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.959447 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd646" event={"ID":"b620a5eb-4f69-4666-9e93-1b00800a07ec","Type":"ContainerDied","Data":"f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd"} Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.959521 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd646" Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.960388 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd646" event={"ID":"b620a5eb-4f69-4666-9e93-1b00800a07ec","Type":"ContainerDied","Data":"25c358e2d99427544bcc72e6f7ee93a9a45ad7c41393d776830df159cdafd5ca"} Mar 18 13:36:12 crc kubenswrapper[4912]: I0318 13:36:12.960452 4912 scope.go:117] "RemoveContainer" containerID="f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.012783 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd646"] Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.014517 4912 scope.go:117] "RemoveContainer" containerID="bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.032599 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd646"] Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.056934 4912 scope.go:117] "RemoveContainer" containerID="d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.123209 4912 scope.go:117] "RemoveContainer" containerID="f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd" Mar 18 
13:36:13 crc kubenswrapper[4912]: E0318 13:36:13.123986 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd\": container with ID starting with f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd not found: ID does not exist" containerID="f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.124026 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd"} err="failed to get container status \"f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd\": rpc error: code = NotFound desc = could not find container \"f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd\": container with ID starting with f8c0b2f2c0fc3eda47c0357b74785599914db5b118c889b06f78ef6d476dbcfd not found: ID does not exist" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.124086 4912 scope.go:117] "RemoveContainer" containerID="bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120" Mar 18 13:36:13 crc kubenswrapper[4912]: E0318 13:36:13.124339 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120\": container with ID starting with bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120 not found: ID does not exist" containerID="bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.124365 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120"} err="failed to get container status 
\"bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120\": rpc error: code = NotFound desc = could not find container \"bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120\": container with ID starting with bbee7897401b7769fd15711c6b8cc2014618d6cc70b226e070a42adf40165120 not found: ID does not exist" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.124384 4912 scope.go:117] "RemoveContainer" containerID="d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3" Mar 18 13:36:13 crc kubenswrapper[4912]: E0318 13:36:13.124905 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3\": container with ID starting with d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3 not found: ID does not exist" containerID="d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3" Mar 18 13:36:13 crc kubenswrapper[4912]: I0318 13:36:13.124930 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3"} err="failed to get container status \"d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3\": rpc error: code = NotFound desc = could not find container \"d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3\": container with ID starting with d947d999f148d07780001d972834eba292c192aec69811d493ab86a3cbcdbdf3 not found: ID does not exist" Mar 18 13:36:14 crc kubenswrapper[4912]: I0318 13:36:14.248182 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" path="/var/lib/kubelet/pods/b620a5eb-4f69-4666-9e93-1b00800a07ec/volumes" Mar 18 13:36:14 crc kubenswrapper[4912]: I0318 13:36:14.992685 4912 generic.go:334] "Generic (PLEG): container finished" podID="c04db868-dfd5-464a-97c3-437a011e243a" 
containerID="291d13c183739b05aa8277a4bce51b96c6b71cce09618e00905140c8a243ccd4" exitCode=0 Mar 18 13:36:14 crc kubenswrapper[4912]: I0318 13:36:14.992782 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" event={"ID":"c04db868-dfd5-464a-97c3-437a011e243a","Type":"ContainerDied","Data":"291d13c183739b05aa8277a4bce51b96c6b71cce09618e00905140c8a243ccd4"} Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.542221 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.636418 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-bootstrap-combined-ca-bundle\") pod \"c04db868-dfd5-464a-97c3-437a011e243a\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.636727 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkdvj\" (UniqueName: \"kubernetes.io/projected/c04db868-dfd5-464a-97c3-437a011e243a-kube-api-access-lkdvj\") pod \"c04db868-dfd5-464a-97c3-437a011e243a\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.636868 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-ssh-key-openstack-edpm-ipam\") pod \"c04db868-dfd5-464a-97c3-437a011e243a\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.636894 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-inventory\") pod \"c04db868-dfd5-464a-97c3-437a011e243a\" (UID: \"c04db868-dfd5-464a-97c3-437a011e243a\") " Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.646039 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c04db868-dfd5-464a-97c3-437a011e243a" (UID: "c04db868-dfd5-464a-97c3-437a011e243a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.648550 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04db868-dfd5-464a-97c3-437a011e243a-kube-api-access-lkdvj" (OuterVolumeSpecName: "kube-api-access-lkdvj") pod "c04db868-dfd5-464a-97c3-437a011e243a" (UID: "c04db868-dfd5-464a-97c3-437a011e243a"). InnerVolumeSpecName "kube-api-access-lkdvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.677312 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c04db868-dfd5-464a-97c3-437a011e243a" (UID: "c04db868-dfd5-464a-97c3-437a011e243a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.680292 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-inventory" (OuterVolumeSpecName: "inventory") pod "c04db868-dfd5-464a-97c3-437a011e243a" (UID: "c04db868-dfd5-464a-97c3-437a011e243a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.740734 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkdvj\" (UniqueName: \"kubernetes.io/projected/c04db868-dfd5-464a-97c3-437a011e243a-kube-api-access-lkdvj\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.740990 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.741071 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:16 crc kubenswrapper[4912]: I0318 13:36:16.741181 4912 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04db868-dfd5-464a-97c3-437a011e243a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.024080 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" event={"ID":"c04db868-dfd5-464a-97c3-437a011e243a","Type":"ContainerDied","Data":"52ae09e259d8ed4b5cdf2100c5a8810769022d72f172ced108bc4deae6ba8b6b"} Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.024186 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ae09e259d8ed4b5cdf2100c5a8810769022d72f172ced108bc4deae6ba8b6b" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.024248 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.119733 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w"] Mar 18 13:36:17 crc kubenswrapper[4912]: E0318 13:36:17.121417 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57140d4c-a1cf-431b-81f5-d702dab52543" containerName="oc" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.121439 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="57140d4c-a1cf-431b-81f5-d702dab52543" containerName="oc" Mar 18 13:36:17 crc kubenswrapper[4912]: E0318 13:36:17.121464 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="registry-server" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.121473 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="registry-server" Mar 18 13:36:17 crc kubenswrapper[4912]: E0318 13:36:17.121507 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04db868-dfd5-464a-97c3-437a011e243a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.121517 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04db868-dfd5-464a-97c3-437a011e243a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 13:36:17 crc kubenswrapper[4912]: E0318 13:36:17.121531 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="extract-content" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.121539 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="extract-content" Mar 18 13:36:17 crc kubenswrapper[4912]: E0318 13:36:17.121580 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="extract-utilities" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.121590 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="extract-utilities" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.122001 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="57140d4c-a1cf-431b-81f5-d702dab52543" containerName="oc" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.122063 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b620a5eb-4f69-4666-9e93-1b00800a07ec" containerName="registry-server" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.122090 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04db868-dfd5-464a-97c3-437a011e243a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.123876 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.128126 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.128127 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.128279 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.128610 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.133382 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w"] Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.257562 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8dmz\" (UniqueName: \"kubernetes.io/projected/c70d046b-5ae9-4514-ac8b-7904ab66c16b-kube-api-access-s8dmz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.258102 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc 
kubenswrapper[4912]: I0318 13:36:17.258485 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.363080 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.363330 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dmz\" (UniqueName: \"kubernetes.io/projected/c70d046b-5ae9-4514-ac8b-7904ab66c16b-kube-api-access-s8dmz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.363496 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.379912 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.381756 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.389883 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dmz\" (UniqueName: \"kubernetes.io/projected/c70d046b-5ae9-4514-ac8b-7904ab66c16b-kube-api-access-s8dmz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:17 crc kubenswrapper[4912]: I0318 13:36:17.457335 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:36:18 crc kubenswrapper[4912]: I0318 13:36:18.068270 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w"] Mar 18 13:36:19 crc kubenswrapper[4912]: I0318 13:36:19.055796 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" event={"ID":"c70d046b-5ae9-4514-ac8b-7904ab66c16b","Type":"ContainerStarted","Data":"f479b8d94579fe7a6efc2dacb931accfe139407c88493344d332d20fcf00db7e"} Mar 18 13:36:19 crc kubenswrapper[4912]: I0318 13:36:19.056591 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" event={"ID":"c70d046b-5ae9-4514-ac8b-7904ab66c16b","Type":"ContainerStarted","Data":"1e9118f31b42bf796dbe3cfaff6e62a4f5191c871990738f1050f985c4cb3529"} Mar 18 13:36:19 crc kubenswrapper[4912]: I0318 13:36:19.092029 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" podStartSLOduration=1.6431681519999999 podStartE2EDuration="2.091994298s" podCreationTimestamp="2026-03-18 13:36:17 +0000 UTC" firstStartedPulling="2026-03-18 13:36:18.073875584 +0000 UTC m=+2026.533303009" lastFinishedPulling="2026-03-18 13:36:18.52270173 +0000 UTC m=+2026.982129155" observedRunningTime="2026-03-18 13:36:19.073242552 +0000 UTC m=+2027.532669997" watchObservedRunningTime="2026-03-18 13:36:19.091994298 +0000 UTC m=+2027.551421733" Mar 18 13:36:24 crc kubenswrapper[4912]: I0318 13:36:24.227800 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:36:24 crc kubenswrapper[4912]: E0318 13:36:24.228967 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:36:35 crc kubenswrapper[4912]: I0318 13:36:35.056280 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-6zf76"] Mar 18 13:36:35 crc kubenswrapper[4912]: I0318 13:36:35.068847 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-6zf76"] Mar 18 13:36:36 crc kubenswrapper[4912]: I0318 13:36:36.243885 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c9cb6b-7493-434e-b069-61ce32dcdc95" path="/var/lib/kubelet/pods/39c9cb6b-7493-434e-b069-61ce32dcdc95/volumes" Mar 18 13:36:37 crc kubenswrapper[4912]: I0318 13:36:37.227822 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:36:37 crc kubenswrapper[4912]: E0318 13:36:37.228512 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:36:50 crc kubenswrapper[4912]: I0318 13:36:50.061534 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n5nn7"] Mar 18 13:36:50 crc kubenswrapper[4912]: I0318 13:36:50.077454 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-tlb72"] Mar 18 13:36:50 crc kubenswrapper[4912]: I0318 13:36:50.091792 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-sync-tlb72"] Mar 18 13:36:50 crc kubenswrapper[4912]: I0318 13:36:50.106258 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n5nn7"] Mar 18 13:36:50 crc kubenswrapper[4912]: I0318 13:36:50.228749 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:36:50 crc kubenswrapper[4912]: E0318 13:36:50.229092 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:36:50 crc kubenswrapper[4912]: I0318 13:36:50.253225 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d3f399-3f5f-4a3a-a7d2-0f9677a4408f" path="/var/lib/kubelet/pods/99d3f399-3f5f-4a3a-a7d2-0f9677a4408f/volumes" Mar 18 13:36:50 crc kubenswrapper[4912]: I0318 13:36:50.261937 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cdd8f3-f85b-4a98-a164-bc9462b4932f" path="/var/lib/kubelet/pods/a4cdd8f3-f85b-4a98-a164-bc9462b4932f/volumes" Mar 18 13:37:00 crc kubenswrapper[4912]: I0318 13:37:00.848015 4912 scope.go:117] "RemoveContainer" containerID="610b5de5bf63bea5146a58b8d390f9f95719eb3d511de8405fc700da8a0f8988" Mar 18 13:37:00 crc kubenswrapper[4912]: I0318 13:37:00.892754 4912 scope.go:117] "RemoveContainer" containerID="cfc4b092fd8cb092417aff68a13ee61fb9866ebb8bb44cd86f45e2a2cca68065" Mar 18 13:37:00 crc kubenswrapper[4912]: I0318 13:37:00.956926 4912 scope.go:117] "RemoveContainer" containerID="78dc5a609a0c2582918480af69b2d8c185274a6637f5c1f26c4d16edc484f34b" Mar 18 13:37:01 crc kubenswrapper[4912]: I0318 13:37:01.049895 4912 scope.go:117] 
"RemoveContainer" containerID="034f5fad2378d4be86d7c75fbbce5883543c7719a8084de0e56ed92b60e91656" Mar 18 13:37:02 crc kubenswrapper[4912]: I0318 13:37:02.245643 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:37:02 crc kubenswrapper[4912]: E0318 13:37:02.248471 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:37:03 crc kubenswrapper[4912]: I0318 13:37:03.050304 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-d7x2r"] Mar 18 13:37:03 crc kubenswrapper[4912]: I0318 13:37:03.087470 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-d7x2r"] Mar 18 13:37:04 crc kubenswrapper[4912]: I0318 13:37:04.251904 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7732fd2-b813-47e5-8f23-823a3037df09" path="/var/lib/kubelet/pods/e7732fd2-b813-47e5-8f23-823a3037df09/volumes" Mar 18 13:37:09 crc kubenswrapper[4912]: I0318 13:37:09.042805 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fbf2l"] Mar 18 13:37:09 crc kubenswrapper[4912]: I0318 13:37:09.060001 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fbf2l"] Mar 18 13:37:10 crc kubenswrapper[4912]: I0318 13:37:10.244631 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147c4d2b-19d3-48da-9364-c527a1cacc3c" path="/var/lib/kubelet/pods/147c4d2b-19d3-48da-9364-c527a1cacc3c/volumes" Mar 18 13:37:15 crc kubenswrapper[4912]: I0318 13:37:15.229203 4912 scope.go:117] "RemoveContainer" 
containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:37:15 crc kubenswrapper[4912]: I0318 13:37:15.751734 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"02489b7ecd2b498e233504537b25815fc33d2b81d54b04defc2536e094a7ae21"} Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.155293 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564018-dnn8m"] Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.158158 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-dnn8m" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.163488 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.163595 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.163803 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.170838 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-dnn8m"] Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.256087 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvcjx\" (UniqueName: \"kubernetes.io/projected/35be5a84-2210-42f0-9e5c-dcbfcc58dad2-kube-api-access-bvcjx\") pod \"auto-csr-approver-29564018-dnn8m\" (UID: \"35be5a84-2210-42f0-9e5c-dcbfcc58dad2\") " pod="openshift-infra/auto-csr-approver-29564018-dnn8m" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.359632 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvcjx\" (UniqueName: \"kubernetes.io/projected/35be5a84-2210-42f0-9e5c-dcbfcc58dad2-kube-api-access-bvcjx\") pod \"auto-csr-approver-29564018-dnn8m\" (UID: \"35be5a84-2210-42f0-9e5c-dcbfcc58dad2\") " pod="openshift-infra/auto-csr-approver-29564018-dnn8m" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.384937 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvcjx\" (UniqueName: \"kubernetes.io/projected/35be5a84-2210-42f0-9e5c-dcbfcc58dad2-kube-api-access-bvcjx\") pod \"auto-csr-approver-29564018-dnn8m\" (UID: \"35be5a84-2210-42f0-9e5c-dcbfcc58dad2\") " pod="openshift-infra/auto-csr-approver-29564018-dnn8m" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.485666 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-dnn8m" Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.989437 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-dnn8m"] Mar 18 13:38:00 crc kubenswrapper[4912]: I0318 13:38:00.994017 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:38:01 crc kubenswrapper[4912]: I0318 13:38:01.256693 4912 scope.go:117] "RemoveContainer" containerID="010d6616847f86f6d8923fa483a53f6b366513a314ef311a8d3c6d68eb430c5b" Mar 18 13:38:01 crc kubenswrapper[4912]: I0318 13:38:01.293225 4912 scope.go:117] "RemoveContainer" containerID="7d8f21a6020213a9a299dca6955d632607150dff3cf297d54334fdd46836370e" Mar 18 13:38:01 crc kubenswrapper[4912]: I0318 13:38:01.362529 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-dnn8m" event={"ID":"35be5a84-2210-42f0-9e5c-dcbfcc58dad2","Type":"ContainerStarted","Data":"ef2163b9494bc29560e93fbe41548e1f8c175f6987b76e485bd4b8b04a9ed529"} Mar 18 13:38:02 crc 
kubenswrapper[4912]: I0318 13:38:02.054582 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mpxrh"] Mar 18 13:38:02 crc kubenswrapper[4912]: I0318 13:38:02.070638 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4ce1-account-create-update-b9dnq"] Mar 18 13:38:02 crc kubenswrapper[4912]: I0318 13:38:02.086328 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mpxrh"] Mar 18 13:38:02 crc kubenswrapper[4912]: I0318 13:38:02.113793 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4ce1-account-create-update-b9dnq"] Mar 18 13:38:02 crc kubenswrapper[4912]: I0318 13:38:02.248900 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7499c8bd-8342-41ca-933a-d0975f9d18e5" path="/var/lib/kubelet/pods/7499c8bd-8342-41ca-933a-d0975f9d18e5/volumes" Mar 18 13:38:02 crc kubenswrapper[4912]: I0318 13:38:02.252409 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1108851-b127-4ccb-8c81-bfbe9de7267e" path="/var/lib/kubelet/pods/e1108851-b127-4ccb-8c81-bfbe9de7267e/volumes" Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.054809 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8prkm"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.067158 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e03c-account-create-update-vtvfd"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.080962 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e151-account-create-update-9552s"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.100816 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jhw4z"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.121194 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-db-create-8prkm"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.137305 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e151-account-create-update-9552s"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.150434 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jhw4z"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.163080 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e03c-account-create-update-vtvfd"] Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.391179 4912 generic.go:334] "Generic (PLEG): container finished" podID="35be5a84-2210-42f0-9e5c-dcbfcc58dad2" containerID="aa655a9ea2f76dc71292ee1353cfbb5ccdb958cd9c458903d5b176394d6c824f" exitCode=0 Mar 18 13:38:03 crc kubenswrapper[4912]: I0318 13:38:03.391237 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-dnn8m" event={"ID":"35be5a84-2210-42f0-9e5c-dcbfcc58dad2","Type":"ContainerDied","Data":"aa655a9ea2f76dc71292ee1353cfbb5ccdb958cd9c458903d5b176394d6c824f"} Mar 18 13:38:04 crc kubenswrapper[4912]: I0318 13:38:04.243227 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6235cc-b662-414c-97c5-0f5b3550d605" path="/var/lib/kubelet/pods/2d6235cc-b662-414c-97c5-0f5b3550d605/volumes" Mar 18 13:38:04 crc kubenswrapper[4912]: I0318 13:38:04.244619 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72532937-9ae3-416d-968f-6a7031ec3055" path="/var/lib/kubelet/pods/72532937-9ae3-416d-968f-6a7031ec3055/volumes" Mar 18 13:38:04 crc kubenswrapper[4912]: I0318 13:38:04.245417 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0303c0-2058-42de-850b-ea7214f3900c" path="/var/lib/kubelet/pods/ba0303c0-2058-42de-850b-ea7214f3900c/volumes" Mar 18 13:38:04 crc kubenswrapper[4912]: I0318 13:38:04.246370 4912 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="cc0d3692-3e21-4b00-9629-f5a4d2140ca9" path="/var/lib/kubelet/pods/cc0d3692-3e21-4b00-9629-f5a4d2140ca9/volumes" Mar 18 13:38:04 crc kubenswrapper[4912]: I0318 13:38:04.412432 4912 generic.go:334] "Generic (PLEG): container finished" podID="c70d046b-5ae9-4514-ac8b-7904ab66c16b" containerID="f479b8d94579fe7a6efc2dacb931accfe139407c88493344d332d20fcf00db7e" exitCode=0 Mar 18 13:38:04 crc kubenswrapper[4912]: I0318 13:38:04.412551 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" event={"ID":"c70d046b-5ae9-4514-ac8b-7904ab66c16b","Type":"ContainerDied","Data":"f479b8d94579fe7a6efc2dacb931accfe139407c88493344d332d20fcf00db7e"} Mar 18 13:38:04 crc kubenswrapper[4912]: I0318 13:38:04.879298 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-dnn8m" Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.001905 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvcjx\" (UniqueName: \"kubernetes.io/projected/35be5a84-2210-42f0-9e5c-dcbfcc58dad2-kube-api-access-bvcjx\") pod \"35be5a84-2210-42f0-9e5c-dcbfcc58dad2\" (UID: \"35be5a84-2210-42f0-9e5c-dcbfcc58dad2\") " Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.010017 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35be5a84-2210-42f0-9e5c-dcbfcc58dad2-kube-api-access-bvcjx" (OuterVolumeSpecName: "kube-api-access-bvcjx") pod "35be5a84-2210-42f0-9e5c-dcbfcc58dad2" (UID: "35be5a84-2210-42f0-9e5c-dcbfcc58dad2"). InnerVolumeSpecName "kube-api-access-bvcjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.105417 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvcjx\" (UniqueName: \"kubernetes.io/projected/35be5a84-2210-42f0-9e5c-dcbfcc58dad2-kube-api-access-bvcjx\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.424175 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-dnn8m" Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.424174 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-dnn8m" event={"ID":"35be5a84-2210-42f0-9e5c-dcbfcc58dad2","Type":"ContainerDied","Data":"ef2163b9494bc29560e93fbe41548e1f8c175f6987b76e485bd4b8b04a9ed529"} Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.425245 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef2163b9494bc29560e93fbe41548e1f8c175f6987b76e485bd4b8b04a9ed529" Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.933700 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.958142 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-ddbbq"] Mar 18 13:38:05 crc kubenswrapper[4912]: I0318 13:38:05.971172 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-ddbbq"] Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.027610 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-inventory\") pod \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.027746 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-ssh-key-openstack-edpm-ipam\") pod \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.027892 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8dmz\" (UniqueName: \"kubernetes.io/projected/c70d046b-5ae9-4514-ac8b-7904ab66c16b-kube-api-access-s8dmz\") pod \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\" (UID: \"c70d046b-5ae9-4514-ac8b-7904ab66c16b\") " Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.051103 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70d046b-5ae9-4514-ac8b-7904ab66c16b-kube-api-access-s8dmz" (OuterVolumeSpecName: "kube-api-access-s8dmz") pod "c70d046b-5ae9-4514-ac8b-7904ab66c16b" (UID: "c70d046b-5ae9-4514-ac8b-7904ab66c16b"). InnerVolumeSpecName "kube-api-access-s8dmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.069498 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-inventory" (OuterVolumeSpecName: "inventory") pod "c70d046b-5ae9-4514-ac8b-7904ab66c16b" (UID: "c70d046b-5ae9-4514-ac8b-7904ab66c16b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.076198 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c70d046b-5ae9-4514-ac8b-7904ab66c16b" (UID: "c70d046b-5ae9-4514-ac8b-7904ab66c16b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.130650 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.130714 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70d046b-5ae9-4514-ac8b-7904ab66c16b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.130730 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8dmz\" (UniqueName: \"kubernetes.io/projected/c70d046b-5ae9-4514-ac8b-7904ab66c16b-kube-api-access-s8dmz\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.255675 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca1374f-79dd-46c8-8486-def1a8f2b9ef" 
path="/var/lib/kubelet/pods/eca1374f-79dd-46c8-8486-def1a8f2b9ef/volumes" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.440861 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" event={"ID":"c70d046b-5ae9-4514-ac8b-7904ab66c16b","Type":"ContainerDied","Data":"1e9118f31b42bf796dbe3cfaff6e62a4f5191c871990738f1050f985c4cb3529"} Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.440918 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e9118f31b42bf796dbe3cfaff6e62a4f5191c871990738f1050f985c4cb3529" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.440953 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.585748 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t"] Mar 18 13:38:06 crc kubenswrapper[4912]: E0318 13:38:06.586460 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70d046b-5ae9-4514-ac8b-7904ab66c16b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.586491 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70d046b-5ae9-4514-ac8b-7904ab66c16b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 13:38:06 crc kubenswrapper[4912]: E0318 13:38:06.586537 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35be5a84-2210-42f0-9e5c-dcbfcc58dad2" containerName="oc" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.586547 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="35be5a84-2210-42f0-9e5c-dcbfcc58dad2" containerName="oc" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.586850 4912 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c70d046b-5ae9-4514-ac8b-7904ab66c16b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.586868 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="35be5a84-2210-42f0-9e5c-dcbfcc58dad2" containerName="oc" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.587829 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.590725 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.590744 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.590834 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.590891 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.598816 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t"] Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.658817 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rg6m\" (UniqueName: \"kubernetes.io/projected/da62fce5-ea10-4763-b58f-81932668abee-kube-api-access-7rg6m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.658975 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.659433 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.760999 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rg6m\" (UniqueName: \"kubernetes.io/projected/da62fce5-ea10-4763-b58f-81932668abee-kube-api-access-7rg6m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.761127 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.761237 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.767706 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.769712 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.780303 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rg6m\" (UniqueName: \"kubernetes.io/projected/da62fce5-ea10-4763-b58f-81932668abee-kube-api-access-7rg6m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:06 crc kubenswrapper[4912]: I0318 13:38:06.915906 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:38:07 crc kubenswrapper[4912]: I0318 13:38:07.563557 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t"] Mar 18 13:38:08 crc kubenswrapper[4912]: I0318 13:38:08.481859 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" event={"ID":"da62fce5-ea10-4763-b58f-81932668abee","Type":"ContainerStarted","Data":"35eba6a1743cc1467916e42292b5964b79ea8ba2265511482dc5e1338e43ae7f"} Mar 18 13:38:08 crc kubenswrapper[4912]: I0318 13:38:08.482726 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" event={"ID":"da62fce5-ea10-4763-b58f-81932668abee","Type":"ContainerStarted","Data":"9652f0e167cedec6a2bba9cf823d38aaed8f21170669baaf3bd7e74aed68076f"} Mar 18 13:38:08 crc kubenswrapper[4912]: I0318 13:38:08.506305 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" podStartSLOduration=2.005380062 podStartE2EDuration="2.506274793s" podCreationTimestamp="2026-03-18 13:38:06 +0000 UTC" firstStartedPulling="2026-03-18 13:38:07.577370269 +0000 UTC m=+2136.036797694" lastFinishedPulling="2026-03-18 13:38:08.078265 +0000 UTC m=+2136.537692425" observedRunningTime="2026-03-18 13:38:08.501171176 +0000 UTC m=+2136.960598611" watchObservedRunningTime="2026-03-18 13:38:08.506274793 +0000 UTC m=+2136.965702228" Mar 18 13:38:10 crc kubenswrapper[4912]: I0318 13:38:10.989489 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6f59b977c9-rwwx4" podUID="08a4effe-9a7e-449c-aba4-74d4b7a4f0ae" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 18 13:38:25 crc kubenswrapper[4912]: I0318 
13:38:25.986543 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6f59b977c9-rwwx4" podUID="08a4effe-9a7e-449c-aba4-74d4b7a4f0ae" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.430875 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k6kl6"] Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.479398 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6kl6"] Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.479697 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.583902 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9qk\" (UniqueName: \"kubernetes.io/projected/ed401f19-95d4-42b1-9e5b-c5b766277eda-kube-api-access-ln9qk\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.583984 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-catalog-content\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.584233 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-utilities\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " 
pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.687442 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9qk\" (UniqueName: \"kubernetes.io/projected/ed401f19-95d4-42b1-9e5b-c5b766277eda-kube-api-access-ln9qk\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.687536 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-catalog-content\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.687666 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-utilities\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.688520 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-utilities\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.688766 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-catalog-content\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " 
pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.713676 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9qk\" (UniqueName: \"kubernetes.io/projected/ed401f19-95d4-42b1-9e5b-c5b766277eda-kube-api-access-ln9qk\") pod \"community-operators-k6kl6\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:27 crc kubenswrapper[4912]: I0318 13:38:27.812792 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:28 crc kubenswrapper[4912]: I0318 13:38:28.511272 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6kl6"] Mar 18 13:38:28 crc kubenswrapper[4912]: I0318 13:38:28.716670 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6kl6" event={"ID":"ed401f19-95d4-42b1-9e5b-c5b766277eda","Type":"ContainerStarted","Data":"2c02af00c2c2e68049fbd23724428113299b1911103d355f9ae71f4b2e1f6c9a"} Mar 18 13:38:29 crc kubenswrapper[4912]: I0318 13:38:29.731208 4912 generic.go:334] "Generic (PLEG): container finished" podID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerID="6b4c7e7161201ae5597c8291388c7f5aa54b4237549b4381639a4bfbcac0173f" exitCode=0 Mar 18 13:38:29 crc kubenswrapper[4912]: I0318 13:38:29.731302 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6kl6" event={"ID":"ed401f19-95d4-42b1-9e5b-c5b766277eda","Type":"ContainerDied","Data":"6b4c7e7161201ae5597c8291388c7f5aa54b4237549b4381639a4bfbcac0173f"} Mar 18 13:38:30 crc kubenswrapper[4912]: I0318 13:38:30.747295 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6kl6" 
event={"ID":"ed401f19-95d4-42b1-9e5b-c5b766277eda","Type":"ContainerStarted","Data":"e73aa1ffae534c3771ef5a3c5fb1ad386d7c2cf9ed2b358cb42e7be2bc521bf0"} Mar 18 13:38:32 crc kubenswrapper[4912]: I0318 13:38:32.785899 4912 generic.go:334] "Generic (PLEG): container finished" podID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerID="e73aa1ffae534c3771ef5a3c5fb1ad386d7c2cf9ed2b358cb42e7be2bc521bf0" exitCode=0 Mar 18 13:38:32 crc kubenswrapper[4912]: I0318 13:38:32.785999 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6kl6" event={"ID":"ed401f19-95d4-42b1-9e5b-c5b766277eda","Type":"ContainerDied","Data":"e73aa1ffae534c3771ef5a3c5fb1ad386d7c2cf9ed2b358cb42e7be2bc521bf0"} Mar 18 13:38:33 crc kubenswrapper[4912]: I0318 13:38:33.803566 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6kl6" event={"ID":"ed401f19-95d4-42b1-9e5b-c5b766277eda","Type":"ContainerStarted","Data":"a1e525e4e24f8a3808213a4eae94513d7f4d2ab2a8cf5b1ecf759476b267ab65"} Mar 18 13:38:33 crc kubenswrapper[4912]: I0318 13:38:33.842377 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6kl6" podStartSLOduration=3.351203058 podStartE2EDuration="6.842352224s" podCreationTimestamp="2026-03-18 13:38:27 +0000 UTC" firstStartedPulling="2026-03-18 13:38:29.734006897 +0000 UTC m=+2158.193434322" lastFinishedPulling="2026-03-18 13:38:33.225156043 +0000 UTC m=+2161.684583488" observedRunningTime="2026-03-18 13:38:33.832018677 +0000 UTC m=+2162.291446102" watchObservedRunningTime="2026-03-18 13:38:33.842352224 +0000 UTC m=+2162.301779649" Mar 18 13:38:37 crc kubenswrapper[4912]: I0318 13:38:37.813675 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:37 crc kubenswrapper[4912]: I0318 13:38:37.814534 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:37 crc kubenswrapper[4912]: I0318 13:38:37.876244 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:42 crc kubenswrapper[4912]: I0318 13:38:42.953654 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6qkgw"] Mar 18 13:38:42 crc kubenswrapper[4912]: I0318 13:38:42.978371 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.002504 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qkgw"] Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.086164 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-catalog-content\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.086389 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-utilities\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.086574 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vskn\" (UniqueName: \"kubernetes.io/projected/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-kube-api-access-6vskn\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " 
pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.189066 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vskn\" (UniqueName: \"kubernetes.io/projected/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-kube-api-access-6vskn\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.189307 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-catalog-content\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.189423 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-utilities\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.189893 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-utilities\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.189938 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-catalog-content\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " 
pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.226287 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vskn\" (UniqueName: \"kubernetes.io/projected/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-kube-api-access-6vskn\") pod \"certified-operators-6qkgw\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:43 crc kubenswrapper[4912]: I0318 13:38:43.329527 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:44 crc kubenswrapper[4912]: I0318 13:38:44.112339 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qkgw"] Mar 18 13:38:44 crc kubenswrapper[4912]: I0318 13:38:44.944339 4912 generic.go:334] "Generic (PLEG): container finished" podID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerID="024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61" exitCode=0 Mar 18 13:38:44 crc kubenswrapper[4912]: I0318 13:38:44.944711 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qkgw" event={"ID":"acb0ebac-7f1a-4d6d-8aba-499a52973d7c","Type":"ContainerDied","Data":"024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61"} Mar 18 13:38:44 crc kubenswrapper[4912]: I0318 13:38:44.946319 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qkgw" event={"ID":"acb0ebac-7f1a-4d6d-8aba-499a52973d7c","Type":"ContainerStarted","Data":"801620b8f955c39c06483e50a5214ebce1c040836d225c271e088e8bdffcb30c"} Mar 18 13:38:45 crc kubenswrapper[4912]: I0318 13:38:45.959495 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qkgw" 
event={"ID":"acb0ebac-7f1a-4d6d-8aba-499a52973d7c","Type":"ContainerStarted","Data":"86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1"} Mar 18 13:38:47 crc kubenswrapper[4912]: I0318 13:38:47.871083 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:48 crc kubenswrapper[4912]: I0318 13:38:48.000618 4912 generic.go:334] "Generic (PLEG): container finished" podID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerID="86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1" exitCode=0 Mar 18 13:38:48 crc kubenswrapper[4912]: I0318 13:38:48.000670 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qkgw" event={"ID":"acb0ebac-7f1a-4d6d-8aba-499a52973d7c","Type":"ContainerDied","Data":"86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1"} Mar 18 13:38:48 crc kubenswrapper[4912]: I0318 13:38:48.498223 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6kl6"] Mar 18 13:38:48 crc kubenswrapper[4912]: I0318 13:38:48.498977 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k6kl6" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="registry-server" containerID="cri-o://a1e525e4e24f8a3808213a4eae94513d7f4d2ab2a8cf5b1ecf759476b267ab65" gracePeriod=2 Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.024390 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qkgw" event={"ID":"acb0ebac-7f1a-4d6d-8aba-499a52973d7c","Type":"ContainerStarted","Data":"9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41"} Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.030362 4912 generic.go:334] "Generic (PLEG): container finished" podID="ed401f19-95d4-42b1-9e5b-c5b766277eda" 
containerID="a1e525e4e24f8a3808213a4eae94513d7f4d2ab2a8cf5b1ecf759476b267ab65" exitCode=0 Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.030417 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6kl6" event={"ID":"ed401f19-95d4-42b1-9e5b-c5b766277eda","Type":"ContainerDied","Data":"a1e525e4e24f8a3808213a4eae94513d7f4d2ab2a8cf5b1ecf759476b267ab65"} Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.074245 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6qkgw" podStartSLOduration=3.452210085 podStartE2EDuration="7.074208924s" podCreationTimestamp="2026-03-18 13:38:42 +0000 UTC" firstStartedPulling="2026-03-18 13:38:44.948619023 +0000 UTC m=+2173.408046438" lastFinishedPulling="2026-03-18 13:38:48.570617852 +0000 UTC m=+2177.030045277" observedRunningTime="2026-03-18 13:38:49.065718506 +0000 UTC m=+2177.525145941" watchObservedRunningTime="2026-03-18 13:38:49.074208924 +0000 UTC m=+2177.533636349" Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.325471 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.397373 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-utilities\") pod \"ed401f19-95d4-42b1-9e5b-c5b766277eda\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.397653 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln9qk\" (UniqueName: \"kubernetes.io/projected/ed401f19-95d4-42b1-9e5b-c5b766277eda-kube-api-access-ln9qk\") pod \"ed401f19-95d4-42b1-9e5b-c5b766277eda\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.397944 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-catalog-content\") pod \"ed401f19-95d4-42b1-9e5b-c5b766277eda\" (UID: \"ed401f19-95d4-42b1-9e5b-c5b766277eda\") " Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.402244 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-utilities" (OuterVolumeSpecName: "utilities") pod "ed401f19-95d4-42b1-9e5b-c5b766277eda" (UID: "ed401f19-95d4-42b1-9e5b-c5b766277eda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.410320 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed401f19-95d4-42b1-9e5b-c5b766277eda-kube-api-access-ln9qk" (OuterVolumeSpecName: "kube-api-access-ln9qk") pod "ed401f19-95d4-42b1-9e5b-c5b766277eda" (UID: "ed401f19-95d4-42b1-9e5b-c5b766277eda"). InnerVolumeSpecName "kube-api-access-ln9qk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.413091 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.497290 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed401f19-95d4-42b1-9e5b-c5b766277eda" (UID: "ed401f19-95d4-42b1-9e5b-c5b766277eda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.519049 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed401f19-95d4-42b1-9e5b-c5b766277eda-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:49 crc kubenswrapper[4912]: I0318 13:38:49.519093 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln9qk\" (UniqueName: \"kubernetes.io/projected/ed401f19-95d4-42b1-9e5b-c5b766277eda-kube-api-access-ln9qk\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.056222 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6kl6" event={"ID":"ed401f19-95d4-42b1-9e5b-c5b766277eda","Type":"ContainerDied","Data":"2c02af00c2c2e68049fbd23724428113299b1911103d355f9ae71f4b2e1f6c9a"} Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.056794 4912 scope.go:117] "RemoveContainer" containerID="a1e525e4e24f8a3808213a4eae94513d7f4d2ab2a8cf5b1ecf759476b267ab65" Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.056710 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6kl6" Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.124120 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6kl6"] Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.143371 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k6kl6"] Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.170698 4912 scope.go:117] "RemoveContainer" containerID="e73aa1ffae534c3771ef5a3c5fb1ad386d7c2cf9ed2b358cb42e7be2bc521bf0" Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.272976 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" path="/var/lib/kubelet/pods/ed401f19-95d4-42b1-9e5b-c5b766277eda/volumes" Mar 18 13:38:50 crc kubenswrapper[4912]: I0318 13:38:50.294284 4912 scope.go:117] "RemoveContainer" containerID="6b4c7e7161201ae5597c8291388c7f5aa54b4237549b4381639a4bfbcac0173f" Mar 18 13:38:53 crc kubenswrapper[4912]: I0318 13:38:53.330143 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:53 crc kubenswrapper[4912]: I0318 13:38:53.330724 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:53 crc kubenswrapper[4912]: I0318 13:38:53.401408 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:54 crc kubenswrapper[4912]: I0318 13:38:54.212564 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:54 crc kubenswrapper[4912]: I0318 13:38:54.498717 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qkgw"] Mar 18 13:38:56 crc 
kubenswrapper[4912]: I0318 13:38:56.162217 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6qkgw" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="registry-server" containerID="cri-o://9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41" gracePeriod=2 Mar 18 13:38:56 crc kubenswrapper[4912]: E0318 13:38:56.508350 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacb0ebac_7f1a_4d6d_8aba_499a52973d7c.slice/crio-conmon-9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41.scope\": RecentStats: unable to find data in memory cache]" Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.746072 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.857397 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-catalog-content\") pod \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.857480 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-utilities\") pod \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\" (UID: \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.857559 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vskn\" (UniqueName: \"kubernetes.io/projected/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-kube-api-access-6vskn\") pod \"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\" (UID: 
\"acb0ebac-7f1a-4d6d-8aba-499a52973d7c\") " Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.859926 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-utilities" (OuterVolumeSpecName: "utilities") pod "acb0ebac-7f1a-4d6d-8aba-499a52973d7c" (UID: "acb0ebac-7f1a-4d6d-8aba-499a52973d7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.875811 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-kube-api-access-6vskn" (OuterVolumeSpecName: "kube-api-access-6vskn") pod "acb0ebac-7f1a-4d6d-8aba-499a52973d7c" (UID: "acb0ebac-7f1a-4d6d-8aba-499a52973d7c"). InnerVolumeSpecName "kube-api-access-6vskn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.927025 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acb0ebac-7f1a-4d6d-8aba-499a52973d7c" (UID: "acb0ebac-7f1a-4d6d-8aba-499a52973d7c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.961583 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.961629 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:56 crc kubenswrapper[4912]: I0318 13:38:56.961643 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vskn\" (UniqueName: \"kubernetes.io/projected/acb0ebac-7f1a-4d6d-8aba-499a52973d7c-kube-api-access-6vskn\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.178463 4912 generic.go:334] "Generic (PLEG): container finished" podID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerID="9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41" exitCode=0 Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.178516 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qkgw" event={"ID":"acb0ebac-7f1a-4d6d-8aba-499a52973d7c","Type":"ContainerDied","Data":"9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41"} Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.178531 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qkgw" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.178547 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qkgw" event={"ID":"acb0ebac-7f1a-4d6d-8aba-499a52973d7c","Type":"ContainerDied","Data":"801620b8f955c39c06483e50a5214ebce1c040836d225c271e088e8bdffcb30c"} Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.178568 4912 scope.go:117] "RemoveContainer" containerID="9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.209156 4912 scope.go:117] "RemoveContainer" containerID="86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.243207 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qkgw"] Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.256678 4912 scope.go:117] "RemoveContainer" containerID="024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.261306 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6qkgw"] Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.359242 4912 scope.go:117] "RemoveContainer" containerID="9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41" Mar 18 13:38:57 crc kubenswrapper[4912]: E0318 13:38:57.366778 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41\": container with ID starting with 9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41 not found: ID does not exist" containerID="9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.366837 4912 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41"} err="failed to get container status \"9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41\": rpc error: code = NotFound desc = could not find container \"9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41\": container with ID starting with 9c501dd38390858ae34d6152a7af7d0c1a2e68bbd0004a0949d8bf49a928bd41 not found: ID does not exist" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.366869 4912 scope.go:117] "RemoveContainer" containerID="86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1" Mar 18 13:38:57 crc kubenswrapper[4912]: E0318 13:38:57.367408 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1\": container with ID starting with 86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1 not found: ID does not exist" containerID="86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.367435 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1"} err="failed to get container status \"86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1\": rpc error: code = NotFound desc = could not find container \"86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1\": container with ID starting with 86473a5470ad43a48454d61299c8fd42e21e04ebd1f58d963606ea759eae27a1 not found: ID does not exist" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.367448 4912 scope.go:117] "RemoveContainer" containerID="024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61" Mar 18 13:38:57 crc kubenswrapper[4912]: E0318 
13:38:57.368950 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61\": container with ID starting with 024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61 not found: ID does not exist" containerID="024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61" Mar 18 13:38:57 crc kubenswrapper[4912]: I0318 13:38:57.368973 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61"} err="failed to get container status \"024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61\": rpc error: code = NotFound desc = could not find container \"024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61\": container with ID starting with 024adaa2c6a9b9fe7e2ba36ceff9949e9bf86016608a492e9e8f172d76107a61 not found: ID does not exist" Mar 18 13:38:58 crc kubenswrapper[4912]: I0318 13:38:58.244480 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" path="/var/lib/kubelet/pods/acb0ebac-7f1a-4d6d-8aba-499a52973d7c/volumes" Mar 18 13:38:59 crc kubenswrapper[4912]: I0318 13:38:59.066189 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2c6pp"] Mar 18 13:38:59 crc kubenswrapper[4912]: I0318 13:38:59.086175 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2c6pp"] Mar 18 13:39:00 crc kubenswrapper[4912]: I0318 13:39:00.249733 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99566935-653a-45d0-94fb-84e8e27435f9" path="/var/lib/kubelet/pods/99566935-653a-45d0-94fb-84e8e27435f9/volumes" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.426990 4912 scope.go:117] "RemoveContainer" 
containerID="419380edd8e17c20b3558a4a9dd04e037afedfeac12420ab2d0f189658a84897" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.495094 4912 scope.go:117] "RemoveContainer" containerID="1e2416fab11d3ffe5be436544ad5826d6ece7509a24a2343bf598e243dfca294" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.548345 4912 scope.go:117] "RemoveContainer" containerID="e3aade8978425c41d2d64020c6223165d8a91513fcc8254ddf345c604fc65a17" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.661499 4912 scope.go:117] "RemoveContainer" containerID="edd91f2375d121cb13ef2d7ab1ae9b08383f08124083958323e19244892bae7f" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.727604 4912 scope.go:117] "RemoveContainer" containerID="58e70d147050f31eaf9672b997b4566c4df939fd9d9b936940889fab2e4b6faa" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.783631 4912 scope.go:117] "RemoveContainer" containerID="a5f37e41a59cddebc7ea3a8d97862a13f1897831c952b7dc81090e18672b682a" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.831683 4912 scope.go:117] "RemoveContainer" containerID="85e163b60ca91061e5b8642afb4b05df30eea078a0dc7303cc5d26b06a7e91e5" Mar 18 13:39:01 crc kubenswrapper[4912]: I0318 13:39:01.867778 4912 scope.go:117] "RemoveContainer" containerID="076a1e44f753d42e5cd16f4c0204df3f7e47f2149a494e94c9f1932723b2c2fb" Mar 18 13:39:15 crc kubenswrapper[4912]: I0318 13:39:15.044330 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-abab-account-create-update-zlj6m"] Mar 18 13:39:15 crc kubenswrapper[4912]: I0318 13:39:15.060831 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-b97r9"] Mar 18 13:39:15 crc kubenswrapper[4912]: I0318 13:39:15.074779 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-b97r9"] Mar 18 13:39:15 crc kubenswrapper[4912]: I0318 13:39:15.089830 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-abab-account-create-update-zlj6m"] Mar 18 
13:39:15 crc kubenswrapper[4912]: I0318 13:39:15.436152 4912 generic.go:334] "Generic (PLEG): container finished" podID="da62fce5-ea10-4763-b58f-81932668abee" containerID="35eba6a1743cc1467916e42292b5964b79ea8ba2265511482dc5e1338e43ae7f" exitCode=0 Mar 18 13:39:15 crc kubenswrapper[4912]: I0318 13:39:15.436277 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" event={"ID":"da62fce5-ea10-4763-b58f-81932668abee","Type":"ContainerDied","Data":"35eba6a1743cc1467916e42292b5964b79ea8ba2265511482dc5e1338e43ae7f"} Mar 18 13:39:16 crc kubenswrapper[4912]: I0318 13:39:16.242785 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab76845a-bb55-4956-9bc9-4066fe9d6d0f" path="/var/lib/kubelet/pods/ab76845a-bb55-4956-9bc9-4066fe9d6d0f/volumes" Mar 18 13:39:16 crc kubenswrapper[4912]: I0318 13:39:16.244114 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c77729-b8c8-4973-bd84-43b29765e681" path="/var/lib/kubelet/pods/c4c77729-b8c8-4973-bd84-43b29765e681/volumes" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.016219 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.092177 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-ssh-key-openstack-edpm-ipam\") pod \"da62fce5-ea10-4763-b58f-81932668abee\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.092856 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-inventory\") pod \"da62fce5-ea10-4763-b58f-81932668abee\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.093121 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rg6m\" (UniqueName: \"kubernetes.io/projected/da62fce5-ea10-4763-b58f-81932668abee-kube-api-access-7rg6m\") pod \"da62fce5-ea10-4763-b58f-81932668abee\" (UID: \"da62fce5-ea10-4763-b58f-81932668abee\") " Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.100662 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da62fce5-ea10-4763-b58f-81932668abee-kube-api-access-7rg6m" (OuterVolumeSpecName: "kube-api-access-7rg6m") pod "da62fce5-ea10-4763-b58f-81932668abee" (UID: "da62fce5-ea10-4763-b58f-81932668abee"). InnerVolumeSpecName "kube-api-access-7rg6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.132424 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "da62fce5-ea10-4763-b58f-81932668abee" (UID: "da62fce5-ea10-4763-b58f-81932668abee"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.141748 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-inventory" (OuterVolumeSpecName: "inventory") pod "da62fce5-ea10-4763-b58f-81932668abee" (UID: "da62fce5-ea10-4763-b58f-81932668abee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.196507 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.196550 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da62fce5-ea10-4763-b58f-81932668abee-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.196561 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rg6m\" (UniqueName: \"kubernetes.io/projected/da62fce5-ea10-4763-b58f-81932668abee-kube-api-access-7rg6m\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.466135 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" 
event={"ID":"da62fce5-ea10-4763-b58f-81932668abee","Type":"ContainerDied","Data":"9652f0e167cedec6a2bba9cf823d38aaed8f21170669baaf3bd7e74aed68076f"} Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.466194 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9652f0e167cedec6a2bba9cf823d38aaed8f21170669baaf3bd7e74aed68076f" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.466280 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.597930 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77"] Mar 18 13:39:17 crc kubenswrapper[4912]: E0318 13:39:17.598607 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="extract-content" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.598628 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="extract-content" Mar 18 13:39:17 crc kubenswrapper[4912]: E0318 13:39:17.598641 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="registry-server" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.598648 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="registry-server" Mar 18 13:39:17 crc kubenswrapper[4912]: E0318 13:39:17.598682 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="registry-server" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.598692 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="registry-server" Mar 18 13:39:17 crc 
kubenswrapper[4912]: E0318 13:39:17.598722 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="extract-utilities" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.598732 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="extract-utilities" Mar 18 13:39:17 crc kubenswrapper[4912]: E0318 13:39:17.598746 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da62fce5-ea10-4763-b58f-81932668abee" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.598754 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="da62fce5-ea10-4763-b58f-81932668abee" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 13:39:17 crc kubenswrapper[4912]: E0318 13:39:17.598776 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="extract-content" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.598782 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="extract-content" Mar 18 13:39:17 crc kubenswrapper[4912]: E0318 13:39:17.598796 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="extract-utilities" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.598804 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="extract-utilities" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.599019 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="da62fce5-ea10-4763-b58f-81932668abee" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.599061 4912 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="acb0ebac-7f1a-4d6d-8aba-499a52973d7c" containerName="registry-server" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.599079 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed401f19-95d4-42b1-9e5b-c5b766277eda" containerName="registry-server" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.600013 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.608937 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.608986 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.609898 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.610176 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.621586 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77"] Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.712088 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwt9\" (UniqueName: \"kubernetes.io/projected/b28d5aa0-b546-4057-80d6-04277d6af5e3-kube-api-access-zdwt9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.712226 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.712800 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.816340 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.816477 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdwt9\" (UniqueName: \"kubernetes.io/projected/b28d5aa0-b546-4057-80d6-04277d6af5e3-kube-api-access-zdwt9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.816559 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.823539 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.827638 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.839174 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwt9\" (UniqueName: \"kubernetes.io/projected/b28d5aa0-b546-4057-80d6-04277d6af5e3-kube-api-access-zdwt9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-csz77\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:17 crc kubenswrapper[4912]: I0318 13:39:17.921968 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:18 crc kubenswrapper[4912]: I0318 13:39:18.592221 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77"] Mar 18 13:39:19 crc kubenswrapper[4912]: I0318 13:39:19.490759 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" event={"ID":"b28d5aa0-b546-4057-80d6-04277d6af5e3","Type":"ContainerStarted","Data":"92ab5d0b6f8bef8e9756b794f656c8c570cd326b5e24364d4eaddf30c3e3fd6b"} Mar 18 13:39:20 crc kubenswrapper[4912]: I0318 13:39:20.508309 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" event={"ID":"b28d5aa0-b546-4057-80d6-04277d6af5e3","Type":"ContainerStarted","Data":"db27dde4000d7e34eff9beac4655be52c330cc2d7a945a74c240b6f379e11ea8"} Mar 18 13:39:20 crc kubenswrapper[4912]: I0318 13:39:20.553510 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" podStartSLOduration=3.002187328 podStartE2EDuration="3.553481851s" podCreationTimestamp="2026-03-18 13:39:17 +0000 UTC" firstStartedPulling="2026-03-18 13:39:18.617505141 +0000 UTC m=+2207.076932566" lastFinishedPulling="2026-03-18 13:39:19.168799664 +0000 UTC m=+2207.628227089" observedRunningTime="2026-03-18 13:39:20.53368318 +0000 UTC m=+2208.993110605" watchObservedRunningTime="2026-03-18 13:39:20.553481851 +0000 UTC m=+2209.012909266" Mar 18 13:39:25 crc kubenswrapper[4912]: I0318 13:39:25.574233 4912 generic.go:334] "Generic (PLEG): container finished" podID="b28d5aa0-b546-4057-80d6-04277d6af5e3" containerID="db27dde4000d7e34eff9beac4655be52c330cc2d7a945a74c240b6f379e11ea8" exitCode=0 Mar 18 13:39:25 crc kubenswrapper[4912]: I0318 13:39:25.574355 4912 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" event={"ID":"b28d5aa0-b546-4057-80d6-04277d6af5e3","Type":"ContainerDied","Data":"db27dde4000d7e34eff9beac4655be52c330cc2d7a945a74c240b6f379e11ea8"} Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.129837 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.204712 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdwt9\" (UniqueName: \"kubernetes.io/projected/b28d5aa0-b546-4057-80d6-04277d6af5e3-kube-api-access-zdwt9\") pod \"b28d5aa0-b546-4057-80d6-04277d6af5e3\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.205245 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-inventory\") pod \"b28d5aa0-b546-4057-80d6-04277d6af5e3\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.205886 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-ssh-key-openstack-edpm-ipam\") pod \"b28d5aa0-b546-4057-80d6-04277d6af5e3\" (UID: \"b28d5aa0-b546-4057-80d6-04277d6af5e3\") " Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.215402 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28d5aa0-b546-4057-80d6-04277d6af5e3-kube-api-access-zdwt9" (OuterVolumeSpecName: "kube-api-access-zdwt9") pod "b28d5aa0-b546-4057-80d6-04277d6af5e3" (UID: "b28d5aa0-b546-4057-80d6-04277d6af5e3"). InnerVolumeSpecName "kube-api-access-zdwt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.237764 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-inventory" (OuterVolumeSpecName: "inventory") pod "b28d5aa0-b546-4057-80d6-04277d6af5e3" (UID: "b28d5aa0-b546-4057-80d6-04277d6af5e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.239297 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b28d5aa0-b546-4057-80d6-04277d6af5e3" (UID: "b28d5aa0-b546-4057-80d6-04277d6af5e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.310807 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.310853 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b28d5aa0-b546-4057-80d6-04277d6af5e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.310871 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdwt9\" (UniqueName: \"kubernetes.io/projected/b28d5aa0-b546-4057-80d6-04277d6af5e3-kube-api-access-zdwt9\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.600380 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" 
event={"ID":"b28d5aa0-b546-4057-80d6-04277d6af5e3","Type":"ContainerDied","Data":"92ab5d0b6f8bef8e9756b794f656c8c570cd326b5e24364d4eaddf30c3e3fd6b"} Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.600433 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ab5d0b6f8bef8e9756b794f656c8c570cd326b5e24364d4eaddf30c3e3fd6b" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.600463 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-csz77" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.698060 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp"] Mar 18 13:39:27 crc kubenswrapper[4912]: E0318 13:39:27.698612 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28d5aa0-b546-4057-80d6-04277d6af5e3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.698633 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28d5aa0-b546-4057-80d6-04277d6af5e3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.698890 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28d5aa0-b546-4057-80d6-04277d6af5e3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.699824 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.702100 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.705253 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.705391 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.709848 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.719926 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfgp7\" (UniqueName: \"kubernetes.io/projected/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-kube-api-access-cfgp7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.720042 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.719929 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp"] Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 
13:39:27.720111 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.822815 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfgp7\" (UniqueName: \"kubernetes.io/projected/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-kube-api-access-cfgp7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.822944 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.822993 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.829645 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.829897 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:27 crc kubenswrapper[4912]: I0318 13:39:27.841656 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfgp7\" (UniqueName: \"kubernetes.io/projected/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-kube-api-access-cfgp7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-62zpp\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:28 crc kubenswrapper[4912]: I0318 13:39:28.030988 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:39:28 crc kubenswrapper[4912]: I0318 13:39:28.865016 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp"] Mar 18 13:39:29 crc kubenswrapper[4912]: I0318 13:39:29.627444 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" event={"ID":"57f0a91c-168e-4a04-a0fa-5d1ea81eea22","Type":"ContainerStarted","Data":"2a3feeabc1386b2c7faab42d4d643ab9f580cfac2d6a0f47dc26b4e1244a0414"} Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.074272 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lkdsg"] Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.096065 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lkdsg"] Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.112810 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tqk4j"] Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.130076 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-knbtx"] Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.133694 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.148674 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tqk4j"] Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.173111 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knbtx"] Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.190494 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kft57\" (UniqueName: \"kubernetes.io/projected/d36547a6-e5af-46e5-8edc-ced02945945a-kube-api-access-kft57\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.190557 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-catalog-content\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.190665 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-utilities\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.247304 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0531362c-01f6-463c-8217-e78b33f55630" path="/var/lib/kubelet/pods/0531362c-01f6-463c-8217-e78b33f55630/volumes" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.250095 4912 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7e3e3ac4-e8a9-473c-96cb-479132a1882d" path="/var/lib/kubelet/pods/7e3e3ac4-e8a9-473c-96cb-479132a1882d/volumes" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.293065 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-utilities\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.293256 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kft57\" (UniqueName: \"kubernetes.io/projected/d36547a6-e5af-46e5-8edc-ced02945945a-kube-api-access-kft57\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.293293 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-catalog-content\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.293935 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-utilities\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.293978 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-catalog-content\") pod \"redhat-operators-knbtx\" (UID: 
\"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.320146 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kft57\" (UniqueName: \"kubernetes.io/projected/d36547a6-e5af-46e5-8edc-ced02945945a-kube-api-access-kft57\") pod \"redhat-operators-knbtx\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.464357 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.660789 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" event={"ID":"57f0a91c-168e-4a04-a0fa-5d1ea81eea22","Type":"ContainerStarted","Data":"fd73994d14d390f77e79ea703b432daea0ad649f91b8c94a99b750cde8701384"} Mar 18 13:39:30 crc kubenswrapper[4912]: I0318 13:39:30.708927 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" podStartSLOduration=2.97744271 podStartE2EDuration="3.7088975s" podCreationTimestamp="2026-03-18 13:39:27 +0000 UTC" firstStartedPulling="2026-03-18 13:39:28.875157505 +0000 UTC m=+2217.334584930" lastFinishedPulling="2026-03-18 13:39:29.606612305 +0000 UTC m=+2218.066039720" observedRunningTime="2026-03-18 13:39:30.685739689 +0000 UTC m=+2219.145167134" watchObservedRunningTime="2026-03-18 13:39:30.7088975 +0000 UTC m=+2219.168324925" Mar 18 13:39:31 crc kubenswrapper[4912]: I0318 13:39:31.019079 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knbtx"] Mar 18 13:39:31 crc kubenswrapper[4912]: W0318 13:39:31.025235 4912 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd36547a6_e5af_46e5_8edc_ced02945945a.slice/crio-6a1ccfc57ac1a62a601b0ef97ce5eae3246bc7d7c6b7e121b3faa64dbef5017b WatchSource:0}: Error finding container 6a1ccfc57ac1a62a601b0ef97ce5eae3246bc7d7c6b7e121b3faa64dbef5017b: Status 404 returned error can't find the container with id 6a1ccfc57ac1a62a601b0ef97ce5eae3246bc7d7c6b7e121b3faa64dbef5017b Mar 18 13:39:31 crc kubenswrapper[4912]: I0318 13:39:31.672454 4912 generic.go:334] "Generic (PLEG): container finished" podID="d36547a6-e5af-46e5-8edc-ced02945945a" containerID="7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501" exitCode=0 Mar 18 13:39:31 crc kubenswrapper[4912]: I0318 13:39:31.672539 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knbtx" event={"ID":"d36547a6-e5af-46e5-8edc-ced02945945a","Type":"ContainerDied","Data":"7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501"} Mar 18 13:39:31 crc kubenswrapper[4912]: I0318 13:39:31.673965 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knbtx" event={"ID":"d36547a6-e5af-46e5-8edc-ced02945945a","Type":"ContainerStarted","Data":"6a1ccfc57ac1a62a601b0ef97ce5eae3246bc7d7c6b7e121b3faa64dbef5017b"} Mar 18 13:39:33 crc kubenswrapper[4912]: I0318 13:39:33.711280 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knbtx" event={"ID":"d36547a6-e5af-46e5-8edc-ced02945945a","Type":"ContainerStarted","Data":"42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891"} Mar 18 13:39:36 crc kubenswrapper[4912]: I0318 13:39:36.998993 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:39:37 
crc kubenswrapper[4912]: I0318 13:39:36.999925 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:39:38 crc kubenswrapper[4912]: I0318 13:39:38.797676 4912 generic.go:334] "Generic (PLEG): container finished" podID="d36547a6-e5af-46e5-8edc-ced02945945a" containerID="42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891" exitCode=0 Mar 18 13:39:38 crc kubenswrapper[4912]: I0318 13:39:38.797748 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knbtx" event={"ID":"d36547a6-e5af-46e5-8edc-ced02945945a","Type":"ContainerDied","Data":"42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891"} Mar 18 13:39:39 crc kubenswrapper[4912]: I0318 13:39:39.812837 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knbtx" event={"ID":"d36547a6-e5af-46e5-8edc-ced02945945a","Type":"ContainerStarted","Data":"fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85"} Mar 18 13:39:39 crc kubenswrapper[4912]: I0318 13:39:39.842915 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-knbtx" podStartSLOduration=2.3141798319999998 podStartE2EDuration="9.842879965s" podCreationTimestamp="2026-03-18 13:39:30 +0000 UTC" firstStartedPulling="2026-03-18 13:39:31.67536895 +0000 UTC m=+2220.134796375" lastFinishedPulling="2026-03-18 13:39:39.204069083 +0000 UTC m=+2227.663496508" observedRunningTime="2026-03-18 13:39:39.831955532 +0000 UTC m=+2228.291382977" watchObservedRunningTime="2026-03-18 13:39:39.842879965 +0000 UTC m=+2228.302307390" Mar 18 13:39:40 crc kubenswrapper[4912]: I0318 13:39:40.465784 4912 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:40 crc kubenswrapper[4912]: I0318 13:39:40.466411 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:41 crc kubenswrapper[4912]: I0318 13:39:41.531476 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knbtx" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="registry-server" probeResult="failure" output=< Mar 18 13:39:41 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:39:41 crc kubenswrapper[4912]: > Mar 18 13:39:50 crc kubenswrapper[4912]: I0318 13:39:50.526078 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:50 crc kubenswrapper[4912]: I0318 13:39:50.589236 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:50 crc kubenswrapper[4912]: I0318 13:39:50.783607 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knbtx"] Mar 18 13:39:51 crc kubenswrapper[4912]: I0318 13:39:51.964775 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-knbtx" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="registry-server" containerID="cri-o://fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85" gracePeriod=2 Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.565187 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.700944 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kft57\" (UniqueName: \"kubernetes.io/projected/d36547a6-e5af-46e5-8edc-ced02945945a-kube-api-access-kft57\") pod \"d36547a6-e5af-46e5-8edc-ced02945945a\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.701501 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-catalog-content\") pod \"d36547a6-e5af-46e5-8edc-ced02945945a\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.701793 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-utilities\") pod \"d36547a6-e5af-46e5-8edc-ced02945945a\" (UID: \"d36547a6-e5af-46e5-8edc-ced02945945a\") " Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.702721 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-utilities" (OuterVolumeSpecName: "utilities") pod "d36547a6-e5af-46e5-8edc-ced02945945a" (UID: "d36547a6-e5af-46e5-8edc-ced02945945a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.715381 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36547a6-e5af-46e5-8edc-ced02945945a-kube-api-access-kft57" (OuterVolumeSpecName: "kube-api-access-kft57") pod "d36547a6-e5af-46e5-8edc-ced02945945a" (UID: "d36547a6-e5af-46e5-8edc-ced02945945a"). InnerVolumeSpecName "kube-api-access-kft57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.807098 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kft57\" (UniqueName: \"kubernetes.io/projected/d36547a6-e5af-46e5-8edc-ced02945945a-kube-api-access-kft57\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.807157 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.869009 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d36547a6-e5af-46e5-8edc-ced02945945a" (UID: "d36547a6-e5af-46e5-8edc-ced02945945a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.910853 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36547a6-e5af-46e5-8edc-ced02945945a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.980673 4912 generic.go:334] "Generic (PLEG): container finished" podID="d36547a6-e5af-46e5-8edc-ced02945945a" containerID="fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85" exitCode=0 Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.980730 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knbtx" event={"ID":"d36547a6-e5af-46e5-8edc-ced02945945a","Type":"ContainerDied","Data":"fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85"} Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.980766 4912 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knbtx" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.980794 4912 scope.go:117] "RemoveContainer" containerID="fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85" Mar 18 13:39:52 crc kubenswrapper[4912]: I0318 13:39:52.980777 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knbtx" event={"ID":"d36547a6-e5af-46e5-8edc-ced02945945a","Type":"ContainerDied","Data":"6a1ccfc57ac1a62a601b0ef97ce5eae3246bc7d7c6b7e121b3faa64dbef5017b"} Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.023773 4912 scope.go:117] "RemoveContainer" containerID="42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891" Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.033934 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knbtx"] Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.047126 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-knbtx"] Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.079870 4912 scope.go:117] "RemoveContainer" containerID="7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501" Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.128274 4912 scope.go:117] "RemoveContainer" containerID="fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85" Mar 18 13:39:53 crc kubenswrapper[4912]: E0318 13:39:53.129010 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85\": container with ID starting with fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85 not found: ID does not exist" containerID="fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85" Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.129076 4912 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85"} err="failed to get container status \"fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85\": rpc error: code = NotFound desc = could not find container \"fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85\": container with ID starting with fd248e095b5c0b1e8cd8844652009de0509da9a3e9f1c894677135019eb4fd85 not found: ID does not exist" Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.129109 4912 scope.go:117] "RemoveContainer" containerID="42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891" Mar 18 13:39:53 crc kubenswrapper[4912]: E0318 13:39:53.129777 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891\": container with ID starting with 42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891 not found: ID does not exist" containerID="42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891" Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.129830 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891"} err="failed to get container status \"42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891\": rpc error: code = NotFound desc = could not find container \"42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891\": container with ID starting with 42a3bca2772d6fbed49782eda3c75cb6800a015715532670b366443248342891 not found: ID does not exist" Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.129852 4912 scope.go:117] "RemoveContainer" containerID="7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501" Mar 18 13:39:53 crc kubenswrapper[4912]: E0318 
13:39:53.130474 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501\": container with ID starting with 7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501 not found: ID does not exist" containerID="7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501" Mar 18 13:39:53 crc kubenswrapper[4912]: I0318 13:39:53.130561 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501"} err="failed to get container status \"7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501\": rpc error: code = NotFound desc = could not find container \"7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501\": container with ID starting with 7085600e860d233e1ad9bfa2a17b2e08c45bbf2bf4282f865b465e93a9182501 not found: ID does not exist" Mar 18 13:39:54 crc kubenswrapper[4912]: I0318 13:39:54.244273 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" path="/var/lib/kubelet/pods/d36547a6-e5af-46e5-8edc-ced02945945a/volumes" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.234078 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564020-fjsrf"] Mar 18 13:40:00 crc kubenswrapper[4912]: E0318 13:40:00.235193 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="registry-server" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.235208 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="registry-server" Mar 18 13:40:00 crc kubenswrapper[4912]: E0318 13:40:00.235249 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="extract-utilities" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.235256 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="extract-utilities" Mar 18 13:40:00 crc kubenswrapper[4912]: E0318 13:40:00.235286 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="extract-content" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.235292 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="extract-content" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.235520 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36547a6-e5af-46e5-8edc-ced02945945a" containerName="registry-server" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.236697 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-fjsrf" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.242461 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.243149 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.243364 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.264015 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-fjsrf"] Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.340819 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hnf\" (UniqueName: 
\"kubernetes.io/projected/264a6d26-eab2-4104-b0b4-bdd17eb770ed-kube-api-access-t7hnf\") pod \"auto-csr-approver-29564020-fjsrf\" (UID: \"264a6d26-eab2-4104-b0b4-bdd17eb770ed\") " pod="openshift-infra/auto-csr-approver-29564020-fjsrf" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.442977 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hnf\" (UniqueName: \"kubernetes.io/projected/264a6d26-eab2-4104-b0b4-bdd17eb770ed-kube-api-access-t7hnf\") pod \"auto-csr-approver-29564020-fjsrf\" (UID: \"264a6d26-eab2-4104-b0b4-bdd17eb770ed\") " pod="openshift-infra/auto-csr-approver-29564020-fjsrf" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.466658 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hnf\" (UniqueName: \"kubernetes.io/projected/264a6d26-eab2-4104-b0b4-bdd17eb770ed-kube-api-access-t7hnf\") pod \"auto-csr-approver-29564020-fjsrf\" (UID: \"264a6d26-eab2-4104-b0b4-bdd17eb770ed\") " pod="openshift-infra/auto-csr-approver-29564020-fjsrf" Mar 18 13:40:00 crc kubenswrapper[4912]: I0318 13:40:00.573924 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-fjsrf" Mar 18 13:40:01 crc kubenswrapper[4912]: I0318 13:40:01.115775 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-fjsrf"] Mar 18 13:40:02 crc kubenswrapper[4912]: I0318 13:40:02.090081 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-fjsrf" event={"ID":"264a6d26-eab2-4104-b0b4-bdd17eb770ed","Type":"ContainerStarted","Data":"773f1bfef88d76facd09be8a53eb64d87e8f7961b70be680698dbe0bc3438f30"} Mar 18 13:40:02 crc kubenswrapper[4912]: I0318 13:40:02.246070 4912 scope.go:117] "RemoveContainer" containerID="f731d42d55b0d10c0a20f3b599df01cc52356b3edbab12131b34da069e7e683a" Mar 18 13:40:02 crc kubenswrapper[4912]: I0318 13:40:02.291299 4912 scope.go:117] "RemoveContainer" containerID="cf6f8672081af8c8ecf267d12292a6b48a19daaa698d77f80325111109309621" Mar 18 13:40:02 crc kubenswrapper[4912]: I0318 13:40:02.397111 4912 scope.go:117] "RemoveContainer" containerID="b80e16972749a07a0dc993e0d46563eb439416c41a6a1a9335cb4aaeb8f4fd8a" Mar 18 13:40:02 crc kubenswrapper[4912]: I0318 13:40:02.461521 4912 scope.go:117] "RemoveContainer" containerID="a17a15307628daee1060fc12806a4ac94691a1855194e5d949370fc0ecd59cd1" Mar 18 13:40:03 crc kubenswrapper[4912]: I0318 13:40:03.106748 4912 generic.go:334] "Generic (PLEG): container finished" podID="264a6d26-eab2-4104-b0b4-bdd17eb770ed" containerID="d0a918a68f10e1d6549125370736916a44135734ea9852d673c60819f11a4744" exitCode=0 Mar 18 13:40:03 crc kubenswrapper[4912]: I0318 13:40:03.107085 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-fjsrf" event={"ID":"264a6d26-eab2-4104-b0b4-bdd17eb770ed","Type":"ContainerDied","Data":"d0a918a68f10e1d6549125370736916a44135734ea9852d673c60819f11a4744"} Mar 18 13:40:04 crc kubenswrapper[4912]: I0318 13:40:04.620996 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-fjsrf" Mar 18 13:40:04 crc kubenswrapper[4912]: I0318 13:40:04.707993 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7hnf\" (UniqueName: \"kubernetes.io/projected/264a6d26-eab2-4104-b0b4-bdd17eb770ed-kube-api-access-t7hnf\") pod \"264a6d26-eab2-4104-b0b4-bdd17eb770ed\" (UID: \"264a6d26-eab2-4104-b0b4-bdd17eb770ed\") " Mar 18 13:40:04 crc kubenswrapper[4912]: I0318 13:40:04.721261 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264a6d26-eab2-4104-b0b4-bdd17eb770ed-kube-api-access-t7hnf" (OuterVolumeSpecName: "kube-api-access-t7hnf") pod "264a6d26-eab2-4104-b0b4-bdd17eb770ed" (UID: "264a6d26-eab2-4104-b0b4-bdd17eb770ed"). InnerVolumeSpecName "kube-api-access-t7hnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:40:04 crc kubenswrapper[4912]: I0318 13:40:04.823949 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7hnf\" (UniqueName: \"kubernetes.io/projected/264a6d26-eab2-4104-b0b4-bdd17eb770ed-kube-api-access-t7hnf\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:05 crc kubenswrapper[4912]: I0318 13:40:05.137713 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-fjsrf" event={"ID":"264a6d26-eab2-4104-b0b4-bdd17eb770ed","Type":"ContainerDied","Data":"773f1bfef88d76facd09be8a53eb64d87e8f7961b70be680698dbe0bc3438f30"} Mar 18 13:40:05 crc kubenswrapper[4912]: I0318 13:40:05.137761 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="773f1bfef88d76facd09be8a53eb64d87e8f7961b70be680698dbe0bc3438f30" Mar 18 13:40:05 crc kubenswrapper[4912]: I0318 13:40:05.137783 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-fjsrf" Mar 18 13:40:05 crc kubenswrapper[4912]: I0318 13:40:05.692518 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-gz98k"] Mar 18 13:40:05 crc kubenswrapper[4912]: I0318 13:40:05.704667 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-gz98k"] Mar 18 13:40:06 crc kubenswrapper[4912]: I0318 13:40:06.245013 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20507bb-693e-4deb-b781-b1358d0c9871" path="/var/lib/kubelet/pods/e20507bb-693e-4deb-b781-b1358d0c9871/volumes" Mar 18 13:40:06 crc kubenswrapper[4912]: I0318 13:40:06.998699 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:40:07 crc kubenswrapper[4912]: I0318 13:40:07.000109 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:40:07 crc kubenswrapper[4912]: I0318 13:40:07.160270 4912 generic.go:334] "Generic (PLEG): container finished" podID="57f0a91c-168e-4a04-a0fa-5d1ea81eea22" containerID="fd73994d14d390f77e79ea703b432daea0ad649f91b8c94a99b750cde8701384" exitCode=0 Mar 18 13:40:07 crc kubenswrapper[4912]: I0318 13:40:07.160338 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" 
event={"ID":"57f0a91c-168e-4a04-a0fa-5d1ea81eea22","Type":"ContainerDied","Data":"fd73994d14d390f77e79ea703b432daea0ad649f91b8c94a99b750cde8701384"} Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.793650 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.857332 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-inventory\") pod \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.857513 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfgp7\" (UniqueName: \"kubernetes.io/projected/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-kube-api-access-cfgp7\") pod \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.857719 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-ssh-key-openstack-edpm-ipam\") pod \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\" (UID: \"57f0a91c-168e-4a04-a0fa-5d1ea81eea22\") " Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.876154 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-kube-api-access-cfgp7" (OuterVolumeSpecName: "kube-api-access-cfgp7") pod "57f0a91c-168e-4a04-a0fa-5d1ea81eea22" (UID: "57f0a91c-168e-4a04-a0fa-5d1ea81eea22"). InnerVolumeSpecName "kube-api-access-cfgp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.896722 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57f0a91c-168e-4a04-a0fa-5d1ea81eea22" (UID: "57f0a91c-168e-4a04-a0fa-5d1ea81eea22"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.903975 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-inventory" (OuterVolumeSpecName: "inventory") pod "57f0a91c-168e-4a04-a0fa-5d1ea81eea22" (UID: "57f0a91c-168e-4a04-a0fa-5d1ea81eea22"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.961485 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.961533 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfgp7\" (UniqueName: \"kubernetes.io/projected/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-kube-api-access-cfgp7\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:08 crc kubenswrapper[4912]: I0318 13:40:08.961554 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f0a91c-168e-4a04-a0fa-5d1ea81eea22-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.211148 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" 
event={"ID":"57f0a91c-168e-4a04-a0fa-5d1ea81eea22","Type":"ContainerDied","Data":"2a3feeabc1386b2c7faab42d4d643ab9f580cfac2d6a0f47dc26b4e1244a0414"} Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.211200 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-62zpp" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.211217 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a3feeabc1386b2c7faab42d4d643ab9f580cfac2d6a0f47dc26b4e1244a0414" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.298196 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h"] Mar 18 13:40:09 crc kubenswrapper[4912]: E0318 13:40:09.299376 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f0a91c-168e-4a04-a0fa-5d1ea81eea22" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.299405 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f0a91c-168e-4a04-a0fa-5d1ea81eea22" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:40:09 crc kubenswrapper[4912]: E0318 13:40:09.299470 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264a6d26-eab2-4104-b0b4-bdd17eb770ed" containerName="oc" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.299481 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="264a6d26-eab2-4104-b0b4-bdd17eb770ed" containerName="oc" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.299796 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f0a91c-168e-4a04-a0fa-5d1ea81eea22" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.299825 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="264a6d26-eab2-4104-b0b4-bdd17eb770ed" containerName="oc" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.301079 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.306117 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.306712 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.306725 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.307002 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.321288 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h"] Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.372340 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.372535 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v696z\" (UniqueName: \"kubernetes.io/projected/e89a9418-20a0-4b00-9012-5c17c43b7170-kube-api-access-v696z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: 
\"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.372594 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.476114 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v696z\" (UniqueName: \"kubernetes.io/projected/e89a9418-20a0-4b00-9012-5c17c43b7170-kube-api-access-v696z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.476426 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.477071 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 
13:40:09.481224 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.489735 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.495091 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v696z\" (UniqueName: \"kubernetes.io/projected/e89a9418-20a0-4b00-9012-5c17c43b7170-kube-api-access-v696z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:09 crc kubenswrapper[4912]: I0318 13:40:09.622432 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:10 crc kubenswrapper[4912]: I0318 13:40:10.202594 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h"] Mar 18 13:40:10 crc kubenswrapper[4912]: I0318 13:40:10.226828 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" event={"ID":"e89a9418-20a0-4b00-9012-5c17c43b7170","Type":"ContainerStarted","Data":"61c5f8783938c1db78a0a7fdcdf1a08b2e4ff2f3586843dc3a6de339b19d14f8"} Mar 18 13:40:11 crc kubenswrapper[4912]: I0318 13:40:11.242232 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" event={"ID":"e89a9418-20a0-4b00-9012-5c17c43b7170","Type":"ContainerStarted","Data":"d0f5e20590af0a7fc297727dc4e95ac4609da1fd56735a7911e91e3f64c4457e"} Mar 18 13:40:11 crc kubenswrapper[4912]: I0318 13:40:11.269213 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" podStartSLOduration=1.761668623 podStartE2EDuration="2.26918144s" podCreationTimestamp="2026-03-18 13:40:09 +0000 UTC" firstStartedPulling="2026-03-18 13:40:10.21338174 +0000 UTC m=+2258.672809165" lastFinishedPulling="2026-03-18 13:40:10.720894557 +0000 UTC m=+2259.180321982" observedRunningTime="2026-03-18 13:40:11.262527811 +0000 UTC m=+2259.721955246" watchObservedRunningTime="2026-03-18 13:40:11.26918144 +0000 UTC m=+2259.728608865" Mar 18 13:40:13 crc kubenswrapper[4912]: I0318 13:40:13.043244 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6qzs"] Mar 18 13:40:13 crc kubenswrapper[4912]: I0318 13:40:13.053716 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6qzs"] Mar 18 13:40:14 crc kubenswrapper[4912]: I0318 
13:40:14.246923 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77126f44-41a8-416e-b198-fd0242a64bb9" path="/var/lib/kubelet/pods/77126f44-41a8-416e-b198-fd0242a64bb9/volumes" Mar 18 13:40:36 crc kubenswrapper[4912]: I0318 13:40:36.998995 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:40:36 crc kubenswrapper[4912]: I0318 13:40:36.999889 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:40:37 crc kubenswrapper[4912]: I0318 13:40:36.999951 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:40:37 crc kubenswrapper[4912]: I0318 13:40:37.001108 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02489b7ecd2b498e233504537b25815fc33d2b81d54b04defc2536e094a7ae21"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:40:37 crc kubenswrapper[4912]: I0318 13:40:37.001163 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://02489b7ecd2b498e233504537b25815fc33d2b81d54b04defc2536e094a7ae21" gracePeriod=600 Mar 18 13:40:37 
crc kubenswrapper[4912]: I0318 13:40:37.408169 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="02489b7ecd2b498e233504537b25815fc33d2b81d54b04defc2536e094a7ae21" exitCode=0 Mar 18 13:40:37 crc kubenswrapper[4912]: I0318 13:40:37.408580 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"02489b7ecd2b498e233504537b25815fc33d2b81d54b04defc2536e094a7ae21"} Mar 18 13:40:37 crc kubenswrapper[4912]: I0318 13:40:37.408629 4912 scope.go:117] "RemoveContainer" containerID="f0ce693a4c723b7e7ba4bae69ee1323b2ecedc810f38565eedd7ec2b967e6cf6" Mar 18 13:40:38 crc kubenswrapper[4912]: I0318 13:40:38.423341 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753"} Mar 18 13:40:55 crc kubenswrapper[4912]: I0318 13:40:55.633869 4912 generic.go:334] "Generic (PLEG): container finished" podID="e89a9418-20a0-4b00-9012-5c17c43b7170" containerID="d0f5e20590af0a7fc297727dc4e95ac4609da1fd56735a7911e91e3f64c4457e" exitCode=0 Mar 18 13:40:55 crc kubenswrapper[4912]: I0318 13:40:55.633949 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" event={"ID":"e89a9418-20a0-4b00-9012-5c17c43b7170","Type":"ContainerDied","Data":"d0f5e20590af0a7fc297727dc4e95ac4609da1fd56735a7911e91e3f64c4457e"} Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.211417 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.401084 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-ssh-key-openstack-edpm-ipam\") pod \"e89a9418-20a0-4b00-9012-5c17c43b7170\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.401139 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v696z\" (UniqueName: \"kubernetes.io/projected/e89a9418-20a0-4b00-9012-5c17c43b7170-kube-api-access-v696z\") pod \"e89a9418-20a0-4b00-9012-5c17c43b7170\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.401204 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-inventory\") pod \"e89a9418-20a0-4b00-9012-5c17c43b7170\" (UID: \"e89a9418-20a0-4b00-9012-5c17c43b7170\") " Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.408650 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89a9418-20a0-4b00-9012-5c17c43b7170-kube-api-access-v696z" (OuterVolumeSpecName: "kube-api-access-v696z") pod "e89a9418-20a0-4b00-9012-5c17c43b7170" (UID: "e89a9418-20a0-4b00-9012-5c17c43b7170"). InnerVolumeSpecName "kube-api-access-v696z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.437584 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e89a9418-20a0-4b00-9012-5c17c43b7170" (UID: "e89a9418-20a0-4b00-9012-5c17c43b7170"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.438841 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-inventory" (OuterVolumeSpecName: "inventory") pod "e89a9418-20a0-4b00-9012-5c17c43b7170" (UID: "e89a9418-20a0-4b00-9012-5c17c43b7170"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.504640 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.505098 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v696z\" (UniqueName: \"kubernetes.io/projected/e89a9418-20a0-4b00-9012-5c17c43b7170-kube-api-access-v696z\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.505112 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89a9418-20a0-4b00-9012-5c17c43b7170-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.663267 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" 
event={"ID":"e89a9418-20a0-4b00-9012-5c17c43b7170","Type":"ContainerDied","Data":"61c5f8783938c1db78a0a7fdcdf1a08b2e4ff2f3586843dc3a6de339b19d14f8"} Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.663323 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61c5f8783938c1db78a0a7fdcdf1a08b2e4ff2f3586843dc3a6de339b19d14f8" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.663414 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.766690 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sjjdz"] Mar 18 13:40:57 crc kubenswrapper[4912]: E0318 13:40:57.767934 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89a9418-20a0-4b00-9012-5c17c43b7170" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.767965 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a9418-20a0-4b00-9012-5c17c43b7170" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.768367 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89a9418-20a0-4b00-9012-5c17c43b7170" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.769626 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.772344 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.772443 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.772579 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.772644 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.809444 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sjjdz"] Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.816655 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.816735 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.817011 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hn4rn\" (UniqueName: \"kubernetes.io/projected/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-kube-api-access-hn4rn\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.919718 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.919785 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.919859 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4rn\" (UniqueName: \"kubernetes.io/projected/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-kube-api-access-hn4rn\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.925529 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 
13:40:57.925577 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:57 crc kubenswrapper[4912]: I0318 13:40:57.953735 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4rn\" (UniqueName: \"kubernetes.io/projected/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-kube-api-access-hn4rn\") pod \"ssh-known-hosts-edpm-deployment-sjjdz\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:58 crc kubenswrapper[4912]: I0318 13:40:58.101603 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:40:58 crc kubenswrapper[4912]: I0318 13:40:58.685333 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sjjdz"] Mar 18 13:40:59 crc kubenswrapper[4912]: I0318 13:40:59.687096 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" event={"ID":"725fe434-fed8-4c32-b6a7-8f320dd6e0fd","Type":"ContainerStarted","Data":"4c16e1d07d1b350b4155e4139502774f837c4d0abce7e27f1097c0691a6c10e8"} Mar 18 13:41:00 crc kubenswrapper[4912]: I0318 13:41:00.699082 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" event={"ID":"725fe434-fed8-4c32-b6a7-8f320dd6e0fd","Type":"ContainerStarted","Data":"778e08de4b6759b7f6df290e9e34d87482fe7f8c7963bb5a27bc4abc3259b20a"} Mar 18 13:41:00 crc kubenswrapper[4912]: I0318 13:41:00.724280 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" podStartSLOduration=2.987477229 
podStartE2EDuration="3.724246611s" podCreationTimestamp="2026-03-18 13:40:57 +0000 UTC" firstStartedPulling="2026-03-18 13:40:58.694231936 +0000 UTC m=+2307.153659361" lastFinishedPulling="2026-03-18 13:40:59.431001318 +0000 UTC m=+2307.890428743" observedRunningTime="2026-03-18 13:41:00.715419244 +0000 UTC m=+2309.174846679" watchObservedRunningTime="2026-03-18 13:41:00.724246611 +0000 UTC m=+2309.183674036" Mar 18 13:41:02 crc kubenswrapper[4912]: I0318 13:41:02.704245 4912 scope.go:117] "RemoveContainer" containerID="628b8ab9fc3c79747a5a94520aec930f980fa9a8222a978992336e29d4e52e87" Mar 18 13:41:02 crc kubenswrapper[4912]: I0318 13:41:02.763745 4912 scope.go:117] "RemoveContainer" containerID="5e2609728611dffb36be40cc38777d196d9cb3d47a8be3589569abd966d3fb8d" Mar 18 13:41:06 crc kubenswrapper[4912]: I0318 13:41:06.781493 4912 generic.go:334] "Generic (PLEG): container finished" podID="725fe434-fed8-4c32-b6a7-8f320dd6e0fd" containerID="778e08de4b6759b7f6df290e9e34d87482fe7f8c7963bb5a27bc4abc3259b20a" exitCode=0 Mar 18 13:41:06 crc kubenswrapper[4912]: I0318 13:41:06.781625 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" event={"ID":"725fe434-fed8-4c32-b6a7-8f320dd6e0fd","Type":"ContainerDied","Data":"778e08de4b6759b7f6df290e9e34d87482fe7f8c7963bb5a27bc4abc3259b20a"} Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.438611 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.468826 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-inventory-0\") pod \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.469005 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn4rn\" (UniqueName: \"kubernetes.io/projected/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-kube-api-access-hn4rn\") pod \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.469118 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-ssh-key-openstack-edpm-ipam\") pod \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\" (UID: \"725fe434-fed8-4c32-b6a7-8f320dd6e0fd\") " Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.476839 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-kube-api-access-hn4rn" (OuterVolumeSpecName: "kube-api-access-hn4rn") pod "725fe434-fed8-4c32-b6a7-8f320dd6e0fd" (UID: "725fe434-fed8-4c32-b6a7-8f320dd6e0fd"). InnerVolumeSpecName "kube-api-access-hn4rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.508289 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "725fe434-fed8-4c32-b6a7-8f320dd6e0fd" (UID: "725fe434-fed8-4c32-b6a7-8f320dd6e0fd"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.514800 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "725fe434-fed8-4c32-b6a7-8f320dd6e0fd" (UID: "725fe434-fed8-4c32-b6a7-8f320dd6e0fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.572802 4912 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.572850 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn4rn\" (UniqueName: \"kubernetes.io/projected/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-kube-api-access-hn4rn\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.572867 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/725fe434-fed8-4c32-b6a7-8f320dd6e0fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.817083 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" event={"ID":"725fe434-fed8-4c32-b6a7-8f320dd6e0fd","Type":"ContainerDied","Data":"4c16e1d07d1b350b4155e4139502774f837c4d0abce7e27f1097c0691a6c10e8"} Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.817152 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c16e1d07d1b350b4155e4139502774f837c4d0abce7e27f1097c0691a6c10e8" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.817186 
4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sjjdz" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.978613 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4"] Mar 18 13:41:08 crc kubenswrapper[4912]: E0318 13:41:08.979184 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725fe434-fed8-4c32-b6a7-8f320dd6e0fd" containerName="ssh-known-hosts-edpm-deployment" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.979203 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="725fe434-fed8-4c32-b6a7-8f320dd6e0fd" containerName="ssh-known-hosts-edpm-deployment" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.979617 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="725fe434-fed8-4c32-b6a7-8f320dd6e0fd" containerName="ssh-known-hosts-edpm-deployment" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.980646 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.986108 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.986140 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.986495 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:41:08 crc kubenswrapper[4912]: I0318 13:41:08.986494 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.010396 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkm7\" (UniqueName: \"kubernetes.io/projected/3fc8fc8f-f538-4240-a174-5144a5592e75-kube-api-access-crkm7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.010981 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.011500 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.013201 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4"] Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.114383 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.114864 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crkm7\" (UniqueName: \"kubernetes.io/projected/3fc8fc8f-f538-4240-a174-5144a5592e75-kube-api-access-crkm7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.114995 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.120004 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.120233 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.142735 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkm7\" (UniqueName: \"kubernetes.io/projected/3fc8fc8f-f538-4240-a174-5144a5592e75-kube-api-access-crkm7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-54hl4\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.306523 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:09 crc kubenswrapper[4912]: I0318 13:41:09.910240 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4"] Mar 18 13:41:10 crc kubenswrapper[4912]: I0318 13:41:10.843812 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" event={"ID":"3fc8fc8f-f538-4240-a174-5144a5592e75","Type":"ContainerStarted","Data":"02033636173b1e3f373755ba552d70949c6c99745cc29166705b19887ff41236"} Mar 18 13:41:10 crc kubenswrapper[4912]: I0318 13:41:10.844649 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" event={"ID":"3fc8fc8f-f538-4240-a174-5144a5592e75","Type":"ContainerStarted","Data":"e400674ccf59603432d10a6f2732b4344b15e628ff1ee0e8f75ab49380230215"} Mar 18 13:41:10 crc kubenswrapper[4912]: I0318 13:41:10.874684 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" podStartSLOduration=2.453451067 podStartE2EDuration="2.874657607s" podCreationTimestamp="2026-03-18 13:41:08 +0000 UTC" firstStartedPulling="2026-03-18 13:41:09.941680286 +0000 UTC m=+2318.401107701" lastFinishedPulling="2026-03-18 13:41:10.362886816 +0000 UTC m=+2318.822314241" observedRunningTime="2026-03-18 13:41:10.863470596 +0000 UTC m=+2319.322898031" watchObservedRunningTime="2026-03-18 13:41:10.874657607 +0000 UTC m=+2319.334085052" Mar 18 13:41:18 crc kubenswrapper[4912]: I0318 13:41:18.936603 4912 generic.go:334] "Generic (PLEG): container finished" podID="3fc8fc8f-f538-4240-a174-5144a5592e75" containerID="02033636173b1e3f373755ba552d70949c6c99745cc29166705b19887ff41236" exitCode=0 Mar 18 13:41:18 crc kubenswrapper[4912]: I0318 13:41:18.936696 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" event={"ID":"3fc8fc8f-f538-4240-a174-5144a5592e75","Type":"ContainerDied","Data":"02033636173b1e3f373755ba552d70949c6c99745cc29166705b19887ff41236"} Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.509317 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.625104 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crkm7\" (UniqueName: \"kubernetes.io/projected/3fc8fc8f-f538-4240-a174-5144a5592e75-kube-api-access-crkm7\") pod \"3fc8fc8f-f538-4240-a174-5144a5592e75\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.625442 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-inventory\") pod \"3fc8fc8f-f538-4240-a174-5144a5592e75\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.625798 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-ssh-key-openstack-edpm-ipam\") pod \"3fc8fc8f-f538-4240-a174-5144a5592e75\" (UID: \"3fc8fc8f-f538-4240-a174-5144a5592e75\") " Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.638416 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc8fc8f-f538-4240-a174-5144a5592e75-kube-api-access-crkm7" (OuterVolumeSpecName: "kube-api-access-crkm7") pod "3fc8fc8f-f538-4240-a174-5144a5592e75" (UID: "3fc8fc8f-f538-4240-a174-5144a5592e75"). InnerVolumeSpecName "kube-api-access-crkm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.664862 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-inventory" (OuterVolumeSpecName: "inventory") pod "3fc8fc8f-f538-4240-a174-5144a5592e75" (UID: "3fc8fc8f-f538-4240-a174-5144a5592e75"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.666935 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3fc8fc8f-f538-4240-a174-5144a5592e75" (UID: "3fc8fc8f-f538-4240-a174-5144a5592e75"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.729381 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.729444 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc8fc8f-f538-4240-a174-5144a5592e75-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.729460 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crkm7\" (UniqueName: \"kubernetes.io/projected/3fc8fc8f-f538-4240-a174-5144a5592e75-kube-api-access-crkm7\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.964695 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" 
event={"ID":"3fc8fc8f-f538-4240-a174-5144a5592e75","Type":"ContainerDied","Data":"e400674ccf59603432d10a6f2732b4344b15e628ff1ee0e8f75ab49380230215"} Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.965255 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e400674ccf59603432d10a6f2732b4344b15e628ff1ee0e8f75ab49380230215" Mar 18 13:41:20 crc kubenswrapper[4912]: I0318 13:41:20.964821 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-54hl4" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.056634 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd"] Mar 18 13:41:21 crc kubenswrapper[4912]: E0318 13:41:21.057476 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc8fc8f-f538-4240-a174-5144a5592e75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.057506 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc8fc8f-f538-4240-a174-5144a5592e75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.057807 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc8fc8f-f538-4240-a174-5144a5592e75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.059053 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.064446 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.064821 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.064973 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.065188 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.069806 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd"] Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.244472 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.245133 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.246118 4912 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s84rr\" (UniqueName: \"kubernetes.io/projected/560fc636-c082-4462-935f-1323ed49eef4-kube-api-access-s84rr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.349519 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.349608 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.349731 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s84rr\" (UniqueName: \"kubernetes.io/projected/560fc636-c082-4462-935f-1323ed49eef4-kube-api-access-s84rr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.355146 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.360587 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.383228 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s84rr\" (UniqueName: \"kubernetes.io/projected/560fc636-c082-4462-935f-1323ed49eef4-kube-api-access-s84rr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.393394 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.947833 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd"] Mar 18 13:41:21 crc kubenswrapper[4912]: I0318 13:41:21.976775 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" event={"ID":"560fc636-c082-4462-935f-1323ed49eef4","Type":"ContainerStarted","Data":"58e8fee168bd2bf22daf3bb0a4c4f5241667d3834f85038878a9ead2bb58f359"} Mar 18 13:41:22 crc kubenswrapper[4912]: I0318 13:41:22.996787 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" event={"ID":"560fc636-c082-4462-935f-1323ed49eef4","Type":"ContainerStarted","Data":"8482abfd00851ae92c0970b8da7307cf50da1e3ad3cd990b45d767180ecdd965"} Mar 18 13:41:23 crc kubenswrapper[4912]: I0318 13:41:23.030496 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" podStartSLOduration=1.610369325 podStartE2EDuration="2.030413243s" podCreationTimestamp="2026-03-18 13:41:21 +0000 UTC" firstStartedPulling="2026-03-18 13:41:21.954978608 +0000 UTC m=+2330.414406033" lastFinishedPulling="2026-03-18 13:41:22.375022526 +0000 UTC m=+2330.834449951" observedRunningTime="2026-03-18 13:41:23.017895957 +0000 UTC m=+2331.477323392" watchObservedRunningTime="2026-03-18 13:41:23.030413243 +0000 UTC m=+2331.489840688" Mar 18 13:41:32 crc kubenswrapper[4912]: I0318 13:41:32.228389 4912 generic.go:334] "Generic (PLEG): container finished" podID="560fc636-c082-4462-935f-1323ed49eef4" containerID="8482abfd00851ae92c0970b8da7307cf50da1e3ad3cd990b45d767180ecdd965" exitCode=0 Mar 18 13:41:32 crc kubenswrapper[4912]: I0318 13:41:32.260322 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" event={"ID":"560fc636-c082-4462-935f-1323ed49eef4","Type":"ContainerDied","Data":"8482abfd00851ae92c0970b8da7307cf50da1e3ad3cd990b45d767180ecdd965"} Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.679397 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.719860 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s84rr\" (UniqueName: \"kubernetes.io/projected/560fc636-c082-4462-935f-1323ed49eef4-kube-api-access-s84rr\") pod \"560fc636-c082-4462-935f-1323ed49eef4\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.720672 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-ssh-key-openstack-edpm-ipam\") pod \"560fc636-c082-4462-935f-1323ed49eef4\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.720752 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-inventory\") pod \"560fc636-c082-4462-935f-1323ed49eef4\" (UID: \"560fc636-c082-4462-935f-1323ed49eef4\") " Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.746542 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560fc636-c082-4462-935f-1323ed49eef4-kube-api-access-s84rr" (OuterVolumeSpecName: "kube-api-access-s84rr") pod "560fc636-c082-4462-935f-1323ed49eef4" (UID: "560fc636-c082-4462-935f-1323ed49eef4"). InnerVolumeSpecName "kube-api-access-s84rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.783314 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "560fc636-c082-4462-935f-1323ed49eef4" (UID: "560fc636-c082-4462-935f-1323ed49eef4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.798204 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-inventory" (OuterVolumeSpecName: "inventory") pod "560fc636-c082-4462-935f-1323ed49eef4" (UID: "560fc636-c082-4462-935f-1323ed49eef4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.825984 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s84rr\" (UniqueName: \"kubernetes.io/projected/560fc636-c082-4462-935f-1323ed49eef4-kube-api-access-s84rr\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.826028 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:33 crc kubenswrapper[4912]: I0318 13:41:33.826057 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560fc636-c082-4462-935f-1323ed49eef4-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.262998 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" 
event={"ID":"560fc636-c082-4462-935f-1323ed49eef4","Type":"ContainerDied","Data":"58e8fee168bd2bf22daf3bb0a4c4f5241667d3834f85038878a9ead2bb58f359"} Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.263056 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e8fee168bd2bf22daf3bb0a4c4f5241667d3834f85038878a9ead2bb58f359" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.263129 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.373824 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq"] Mar 18 13:41:34 crc kubenswrapper[4912]: E0318 13:41:34.374489 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560fc636-c082-4462-935f-1323ed49eef4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.374510 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="560fc636-c082-4462-935f-1323ed49eef4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.374777 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="560fc636-c082-4462-935f-1323ed49eef4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.375716 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.380677 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.380814 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.381282 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.381349 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.381397 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.381438 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.381915 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.381923 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.382730 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.393568 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq"] Mar 18 13:41:34 crc 
kubenswrapper[4912]: I0318 13:41:34.445681 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.445746 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.445928 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446048 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446143 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446177 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446216 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446285 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446332 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446363 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446422 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446470 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446491 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446551 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkvwq\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-kube-api-access-vkvwq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446586 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.446617 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: 
\"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549559 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549627 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549657 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549687 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: 
\"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549728 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549747 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549788 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549839 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.549876 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.550242 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.550275 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.550304 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.550336 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.550372 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkvwq\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-kube-api-access-vkvwq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.550404 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.550445 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.557206 4912 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.557300 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.558158 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.559015 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.559445 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.559615 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.560165 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.560918 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.561136 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.561287 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.561700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.562318 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.562543 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.563271 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.568748 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkvwq\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-kube-api-access-vkvwq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.570855 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:34 crc kubenswrapper[4912]: I0318 13:41:34.698068 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:41:35 crc kubenswrapper[4912]: I0318 13:41:35.245881 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq"] Mar 18 13:41:35 crc kubenswrapper[4912]: I0318 13:41:35.274752 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" event={"ID":"429eaf59-f68a-4347-8e97-77c61e6213e3","Type":"ContainerStarted","Data":"61bd19848301c0d2d7e5391a8d21b9e9141944d9c8a07a0a9bc0c77b8c3c944a"} Mar 18 13:41:37 crc kubenswrapper[4912]: I0318 13:41:37.351712 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" event={"ID":"429eaf59-f68a-4347-8e97-77c61e6213e3","Type":"ContainerStarted","Data":"ff385e494ed4afb3b9cd61e867b2fc4aa64d21686d535dc40b464b0053ef4ca8"} Mar 18 13:41:37 crc kubenswrapper[4912]: I0318 13:41:37.378470 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" podStartSLOduration=2.538153112 podStartE2EDuration="3.378446483s" podCreationTimestamp="2026-03-18 13:41:34 +0000 UTC" firstStartedPulling="2026-03-18 13:41:35.250015946 +0000 UTC m=+2343.709443371" lastFinishedPulling="2026-03-18 13:41:36.090309317 +0000 UTC m=+2344.549736742" observedRunningTime="2026-03-18 13:41:37.374760944 +0000 UTC m=+2345.834188389" watchObservedRunningTime="2026-03-18 13:41:37.378446483 +0000 UTC m=+2345.837873908" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.175731 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tvwgv"] Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.179500 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tvwgv" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.190895 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.191222 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.203792 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tvwgv"] Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.213772 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.289182 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qjfg\" (UniqueName: \"kubernetes.io/projected/09d0116e-2b66-4126-b2bb-9f7920bfcbd3-kube-api-access-6qjfg\") pod \"auto-csr-approver-29564022-tvwgv\" (UID: \"09d0116e-2b66-4126-b2bb-9f7920bfcbd3\") " pod="openshift-infra/auto-csr-approver-29564022-tvwgv" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.392288 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qjfg\" (UniqueName: \"kubernetes.io/projected/09d0116e-2b66-4126-b2bb-9f7920bfcbd3-kube-api-access-6qjfg\") pod \"auto-csr-approver-29564022-tvwgv\" (UID: \"09d0116e-2b66-4126-b2bb-9f7920bfcbd3\") " pod="openshift-infra/auto-csr-approver-29564022-tvwgv" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.416604 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qjfg\" (UniqueName: \"kubernetes.io/projected/09d0116e-2b66-4126-b2bb-9f7920bfcbd3-kube-api-access-6qjfg\") pod \"auto-csr-approver-29564022-tvwgv\" (UID: \"09d0116e-2b66-4126-b2bb-9f7920bfcbd3\") " 
pod="openshift-infra/auto-csr-approver-29564022-tvwgv" Mar 18 13:42:00 crc kubenswrapper[4912]: I0318 13:42:00.514840 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tvwgv" Mar 18 13:42:01 crc kubenswrapper[4912]: I0318 13:42:01.037186 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tvwgv"] Mar 18 13:42:01 crc kubenswrapper[4912]: I0318 13:42:01.635713 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-tvwgv" event={"ID":"09d0116e-2b66-4126-b2bb-9f7920bfcbd3","Type":"ContainerStarted","Data":"9515c86b7ce0f77b7d8d759dd661e4475e2f77c9b9418ccd21122e8cec193b42"} Mar 18 13:42:03 crc kubenswrapper[4912]: I0318 13:42:03.664769 4912 generic.go:334] "Generic (PLEG): container finished" podID="09d0116e-2b66-4126-b2bb-9f7920bfcbd3" containerID="9a569fdaec78e3c77047c4fdd079d7d4f968ec80f9f59210b33b30d531b5f9e3" exitCode=0 Mar 18 13:42:03 crc kubenswrapper[4912]: I0318 13:42:03.664891 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-tvwgv" event={"ID":"09d0116e-2b66-4126-b2bb-9f7920bfcbd3","Type":"ContainerDied","Data":"9a569fdaec78e3c77047c4fdd079d7d4f968ec80f9f59210b33b30d531b5f9e3"} Mar 18 13:42:05 crc kubenswrapper[4912]: I0318 13:42:05.124796 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tvwgv" Mar 18 13:42:05 crc kubenswrapper[4912]: I0318 13:42:05.267613 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qjfg\" (UniqueName: \"kubernetes.io/projected/09d0116e-2b66-4126-b2bb-9f7920bfcbd3-kube-api-access-6qjfg\") pod \"09d0116e-2b66-4126-b2bb-9f7920bfcbd3\" (UID: \"09d0116e-2b66-4126-b2bb-9f7920bfcbd3\") " Mar 18 13:42:05 crc kubenswrapper[4912]: I0318 13:42:05.275010 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d0116e-2b66-4126-b2bb-9f7920bfcbd3-kube-api-access-6qjfg" (OuterVolumeSpecName: "kube-api-access-6qjfg") pod "09d0116e-2b66-4126-b2bb-9f7920bfcbd3" (UID: "09d0116e-2b66-4126-b2bb-9f7920bfcbd3"). InnerVolumeSpecName "kube-api-access-6qjfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:05 crc kubenswrapper[4912]: I0318 13:42:05.371304 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qjfg\" (UniqueName: \"kubernetes.io/projected/09d0116e-2b66-4126-b2bb-9f7920bfcbd3-kube-api-access-6qjfg\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:05 crc kubenswrapper[4912]: I0318 13:42:05.692304 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-tvwgv" event={"ID":"09d0116e-2b66-4126-b2bb-9f7920bfcbd3","Type":"ContainerDied","Data":"9515c86b7ce0f77b7d8d759dd661e4475e2f77c9b9418ccd21122e8cec193b42"} Mar 18 13:42:05 crc kubenswrapper[4912]: I0318 13:42:05.692824 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9515c86b7ce0f77b7d8d759dd661e4475e2f77c9b9418ccd21122e8cec193b42" Mar 18 13:42:05 crc kubenswrapper[4912]: I0318 13:42:05.692403 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tvwgv" Mar 18 13:42:06 crc kubenswrapper[4912]: I0318 13:42:06.213390 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-fsw7b"] Mar 18 13:42:06 crc kubenswrapper[4912]: I0318 13:42:06.226486 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-fsw7b"] Mar 18 13:42:06 crc kubenswrapper[4912]: I0318 13:42:06.241687 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57140d4c-a1cf-431b-81f5-d702dab52543" path="/var/lib/kubelet/pods/57140d4c-a1cf-431b-81f5-d702dab52543/volumes" Mar 18 13:42:18 crc kubenswrapper[4912]: I0318 13:42:18.067099 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-4chng"] Mar 18 13:42:18 crc kubenswrapper[4912]: I0318 13:42:18.082203 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-4chng"] Mar 18 13:42:18 crc kubenswrapper[4912]: I0318 13:42:18.244003 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068bb242-8e37-448f-b647-ee255f9104b9" path="/var/lib/kubelet/pods/068bb242-8e37-448f-b647-ee255f9104b9/volumes" Mar 18 13:42:18 crc kubenswrapper[4912]: I0318 13:42:18.860880 4912 generic.go:334] "Generic (PLEG): container finished" podID="429eaf59-f68a-4347-8e97-77c61e6213e3" containerID="ff385e494ed4afb3b9cd61e867b2fc4aa64d21686d535dc40b464b0053ef4ca8" exitCode=0 Mar 18 13:42:18 crc kubenswrapper[4912]: I0318 13:42:18.860949 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" event={"ID":"429eaf59-f68a-4347-8e97-77c61e6213e3","Type":"ContainerDied","Data":"ff385e494ed4afb3b9cd61e867b2fc4aa64d21686d535dc40b464b0053ef4ca8"} Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.419607 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.475775 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-bootstrap-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.475891 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-neutron-metadata-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476108 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkvwq\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-kube-api-access-vkvwq\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476162 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-nova-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476235 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-inventory\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 
13:42:20.476286 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476324 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-libvirt-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476419 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476498 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476541 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-power-monitoring-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: 
\"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476606 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476636 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476665 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-repo-setup-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476722 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476771 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ssh-key-openstack-edpm-ipam\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " 
Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.476794 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ovn-combined-ca-bundle\") pod \"429eaf59-f68a-4347-8e97-77c61e6213e3\" (UID: \"429eaf59-f68a-4347-8e97-77c61e6213e3\") " Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.486852 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.488805 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-kube-api-access-vkvwq" (OuterVolumeSpecName: "kube-api-access-vkvwq") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "kube-api-access-vkvwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.489177 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.489248 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.489276 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.489329 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.489438 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.490108 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.490482 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.490824 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.493414 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.494367 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.497556 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.510785 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.527782 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-inventory" (OuterVolumeSpecName: "inventory") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.534355 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "429eaf59-f68a-4347-8e97-77c61e6213e3" (UID: "429eaf59-f68a-4347-8e97-77c61e6213e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579859 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579900 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579917 4912 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 
13:42:20.579930 4912 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579941 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579951 4912 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579961 4912 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579970 4912 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579979 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkvwq\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-kube-api-access-vkvwq\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.579989 4912 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 
crc kubenswrapper[4912]: I0318 13:42:20.579998 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.580007 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.580018 4912 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.580027 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.580051 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/429eaf59-f68a-4347-8e97-77c61e6213e3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.580062 4912 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429eaf59-f68a-4347-8e97-77c61e6213e3-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.889784 4912 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" event={"ID":"429eaf59-f68a-4347-8e97-77c61e6213e3","Type":"ContainerDied","Data":"61bd19848301c0d2d7e5391a8d21b9e9141944d9c8a07a0a9bc0c77b8c3c944a"} Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.889833 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61bd19848301c0d2d7e5391a8d21b9e9141944d9c8a07a0a9bc0c77b8c3c944a" Mar 18 13:42:20 crc kubenswrapper[4912]: I0318 13:42:20.889831 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.015462 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4"] Mar 18 13:42:21 crc kubenswrapper[4912]: E0318 13:42:21.016575 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d0116e-2b66-4126-b2bb-9f7920bfcbd3" containerName="oc" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.016599 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d0116e-2b66-4126-b2bb-9f7920bfcbd3" containerName="oc" Mar 18 13:42:21 crc kubenswrapper[4912]: E0318 13:42:21.016622 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429eaf59-f68a-4347-8e97-77c61e6213e3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.016633 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="429eaf59-f68a-4347-8e97-77c61e6213e3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.016929 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d0116e-2b66-4126-b2bb-9f7920bfcbd3" containerName="oc" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.016977 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="429eaf59-f68a-4347-8e97-77c61e6213e3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.018140 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.021764 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.022151 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.022383 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.022543 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.023494 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.023863 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4"] Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.092791 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.093029 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p22fv\" 
(UniqueName: \"kubernetes.io/projected/2fd0ee26-077e-472f-9bfb-0f6247895102-kube-api-access-p22fv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.093311 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2fd0ee26-077e-472f-9bfb-0f6247895102-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.093447 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.093543 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.195898 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p22fv\" (UniqueName: \"kubernetes.io/projected/2fd0ee26-077e-472f-9bfb-0f6247895102-kube-api-access-p22fv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: 
\"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.196060 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2fd0ee26-077e-472f-9bfb-0f6247895102-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.196121 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.196187 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.196433 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.197166 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/2fd0ee26-077e-472f-9bfb-0f6247895102-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.202000 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.203355 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.207737 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.214208 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p22fv\" (UniqueName: \"kubernetes.io/projected/2fd0ee26-077e-472f-9bfb-0f6247895102-kube-api-access-p22fv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jljj4\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc 
kubenswrapper[4912]: I0318 13:42:21.356873 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" Mar 18 13:42:21 crc kubenswrapper[4912]: I0318 13:42:21.924684 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4"] Mar 18 13:42:21 crc kubenswrapper[4912]: W0318 13:42:21.928253 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd0ee26_077e_472f_9bfb_0f6247895102.slice/crio-4391f56f7e8e612f880da3a421c9ff2366ecbe4055e0ab5976d09b87b3fc59e5 WatchSource:0}: Error finding container 4391f56f7e8e612f880da3a421c9ff2366ecbe4055e0ab5976d09b87b3fc59e5: Status 404 returned error can't find the container with id 4391f56f7e8e612f880da3a421c9ff2366ecbe4055e0ab5976d09b87b3fc59e5 Mar 18 13:42:22 crc kubenswrapper[4912]: I0318 13:42:22.915016 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" event={"ID":"2fd0ee26-077e-472f-9bfb-0f6247895102","Type":"ContainerStarted","Data":"4391f56f7e8e612f880da3a421c9ff2366ecbe4055e0ab5976d09b87b3fc59e5"} Mar 18 13:42:23 crc kubenswrapper[4912]: I0318 13:42:23.926115 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" event={"ID":"2fd0ee26-077e-472f-9bfb-0f6247895102","Type":"ContainerStarted","Data":"9cb674d1d80593d18076ab0b090e1bc88444528e3598b4d357fa7be6b221f0aa"} Mar 18 13:42:23 crc kubenswrapper[4912]: I0318 13:42:23.955114 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" podStartSLOduration=2.30110033 podStartE2EDuration="3.955089518s" podCreationTimestamp="2026-03-18 13:42:20 +0000 UTC" firstStartedPulling="2026-03-18 13:42:21.931206149 +0000 UTC m=+2390.390633574" lastFinishedPulling="2026-03-18 
13:42:23.585195327 +0000 UTC m=+2392.044622762" observedRunningTime="2026-03-18 13:42:23.947549596 +0000 UTC m=+2392.406977041" watchObservedRunningTime="2026-03-18 13:42:23.955089518 +0000 UTC m=+2392.414516943" Mar 18 13:43:02 crc kubenswrapper[4912]: I0318 13:43:02.920630 4912 scope.go:117] "RemoveContainer" containerID="9e30b972648fb32f4fe04b9acc73cb5d03fcf332ac1b29f167e2a4018ca209cb" Mar 18 13:43:02 crc kubenswrapper[4912]: I0318 13:43:02.993751 4912 scope.go:117] "RemoveContainer" containerID="ad249e259e6551d81edf9dc8e6289145da4a2416361639f9e86448849c665ba3" Mar 18 13:43:03 crc kubenswrapper[4912]: I0318 13:43:03.099709 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-qgkzn"] Mar 18 13:43:03 crc kubenswrapper[4912]: I0318 13:43:03.154653 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-qgkzn"] Mar 18 13:43:04 crc kubenswrapper[4912]: I0318 13:43:04.242896 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880a8bf3-227a-44a5-89ef-ee032d977775" path="/var/lib/kubelet/pods/880a8bf3-227a-44a5-89ef-ee032d977775/volumes" Mar 18 13:43:06 crc kubenswrapper[4912]: I0318 13:43:06.998901 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:43:07 crc kubenswrapper[4912]: I0318 13:43:06.999587 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:43:22 crc kubenswrapper[4912]: E0318 13:43:22.107144 4912 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd0ee26_077e_472f_9bfb_0f6247895102.slice/crio-conmon-9cb674d1d80593d18076ab0b090e1bc88444528e3598b4d357fa7be6b221f0aa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd0ee26_077e_472f_9bfb_0f6247895102.slice/crio-9cb674d1d80593d18076ab0b090e1bc88444528e3598b4d357fa7be6b221f0aa.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 13:43:22 crc kubenswrapper[4912]: I0318 13:43:22.652773 4912 generic.go:334] "Generic (PLEG): container finished" podID="2fd0ee26-077e-472f-9bfb-0f6247895102" containerID="9cb674d1d80593d18076ab0b090e1bc88444528e3598b4d357fa7be6b221f0aa" exitCode=0
Mar 18 13:43:22 crc kubenswrapper[4912]: I0318 13:43:22.653281 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" event={"ID":"2fd0ee26-077e-472f-9bfb-0f6247895102","Type":"ContainerDied","Data":"9cb674d1d80593d18076ab0b090e1bc88444528e3598b4d357fa7be6b221f0aa"}
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.228952 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.322675 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p22fv\" (UniqueName: \"kubernetes.io/projected/2fd0ee26-077e-472f-9bfb-0f6247895102-kube-api-access-p22fv\") pod \"2fd0ee26-077e-472f-9bfb-0f6247895102\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") "
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.322888 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-inventory\") pod \"2fd0ee26-077e-472f-9bfb-0f6247895102\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") "
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.323008 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ssh-key-openstack-edpm-ipam\") pod \"2fd0ee26-077e-472f-9bfb-0f6247895102\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") "
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.323100 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ovn-combined-ca-bundle\") pod \"2fd0ee26-077e-472f-9bfb-0f6247895102\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") "
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.323140 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2fd0ee26-077e-472f-9bfb-0f6247895102-ovncontroller-config-0\") pod \"2fd0ee26-077e-472f-9bfb-0f6247895102\" (UID: \"2fd0ee26-077e-472f-9bfb-0f6247895102\") "
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.330283 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2fd0ee26-077e-472f-9bfb-0f6247895102" (UID: "2fd0ee26-077e-472f-9bfb-0f6247895102"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.333131 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd0ee26-077e-472f-9bfb-0f6247895102-kube-api-access-p22fv" (OuterVolumeSpecName: "kube-api-access-p22fv") pod "2fd0ee26-077e-472f-9bfb-0f6247895102" (UID: "2fd0ee26-077e-472f-9bfb-0f6247895102"). InnerVolumeSpecName "kube-api-access-p22fv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.356554 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd0ee26-077e-472f-9bfb-0f6247895102-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "2fd0ee26-077e-472f-9bfb-0f6247895102" (UID: "2fd0ee26-077e-472f-9bfb-0f6247895102"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.361355 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-inventory" (OuterVolumeSpecName: "inventory") pod "2fd0ee26-077e-472f-9bfb-0f6247895102" (UID: "2fd0ee26-077e-472f-9bfb-0f6247895102"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.364515 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2fd0ee26-077e-472f-9bfb-0f6247895102" (UID: "2fd0ee26-077e-472f-9bfb-0f6247895102"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.426687 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.426723 4912 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.426733 4912 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2fd0ee26-077e-472f-9bfb-0f6247895102-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.426743 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p22fv\" (UniqueName: \"kubernetes.io/projected/2fd0ee26-077e-472f-9bfb-0f6247895102-kube-api-access-p22fv\") on node \"crc\" DevicePath \"\""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.426755 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd0ee26-077e-472f-9bfb-0f6247895102-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.683833 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4" event={"ID":"2fd0ee26-077e-472f-9bfb-0f6247895102","Type":"ContainerDied","Data":"4391f56f7e8e612f880da3a421c9ff2366ecbe4055e0ab5976d09b87b3fc59e5"}
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.683883 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4391f56f7e8e612f880da3a421c9ff2366ecbe4055e0ab5976d09b87b3fc59e5"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.683924 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jljj4"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.799999 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"]
Mar 18 13:43:24 crc kubenswrapper[4912]: E0318 13:43:24.800781 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd0ee26-077e-472f-9bfb-0f6247895102" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.800814 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd0ee26-077e-472f-9bfb-0f6247895102" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.801269 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd0ee26-077e-472f-9bfb-0f6247895102" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.802542 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.805441 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.805822 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.806639 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.806644 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.806883 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.808130 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.812445 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"]
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.940792 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.940854 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.940906 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.940937 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.941310 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:24 crc kubenswrapper[4912]: I0318 13:43:24.941445 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8b9\" (UniqueName: \"kubernetes.io/projected/0e4b3676-5cda-4170-86e4-ce503ec22aa0-kube-api-access-dc8b9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.043691 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.043837 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8b9\" (UniqueName: \"kubernetes.io/projected/0e4b3676-5cda-4170-86e4-ce503ec22aa0-kube-api-access-dc8b9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.043896 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.043914 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.043947 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.043971 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.050297 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.050883 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.052983 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.053129 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.053129 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.063593 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8b9\" (UniqueName: \"kubernetes.io/projected/0e4b3676-5cda-4170-86e4-ce503ec22aa0-kube-api-access-dc8b9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.123193 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.738840 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"]
Mar 18 13:43:25 crc kubenswrapper[4912]: I0318 13:43:25.745812 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 13:43:26 crc kubenswrapper[4912]: I0318 13:43:26.709161 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh" event={"ID":"0e4b3676-5cda-4170-86e4-ce503ec22aa0","Type":"ContainerStarted","Data":"b40eaeca47a94ad94adb7d8312ca9aa17e219d7b52205a9ee99ca365405f1e58"}
Mar 18 13:43:30 crc kubenswrapper[4912]: I0318 13:43:30.761743 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh" event={"ID":"0e4b3676-5cda-4170-86e4-ce503ec22aa0","Type":"ContainerStarted","Data":"3b00a7bb3b93c57dd3c2ec919507ef237bf55c913e18e9163dfe2b55cd397675"}
Mar 18 13:43:30 crc kubenswrapper[4912]: I0318 13:43:30.793987 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh" podStartSLOduration=2.303416605 podStartE2EDuration="6.793962441s" podCreationTimestamp="2026-03-18 13:43:24 +0000 UTC" firstStartedPulling="2026-03-18 13:43:25.745563987 +0000 UTC m=+2454.204991412" lastFinishedPulling="2026-03-18 13:43:30.236109823 +0000 UTC m=+2458.695537248" observedRunningTime="2026-03-18 13:43:30.782573274 +0000 UTC m=+2459.242000719" watchObservedRunningTime="2026-03-18 13:43:30.793962441 +0000 UTC m=+2459.253389856"
Mar 18 13:43:36 crc kubenswrapper[4912]: I0318 13:43:36.998651 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:43:37 crc kubenswrapper[4912]: I0318 13:43:36.999580 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.153066 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564024-vf9tx"]
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.155967 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-vf9tx"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.161538 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.161541 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.162530 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.171703 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-vf9tx"]
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.236716 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hkqf\" (UniqueName: \"kubernetes.io/projected/88cc1d26-0b9e-4b34-9280-848826e923c0-kube-api-access-5hkqf\") pod \"auto-csr-approver-29564024-vf9tx\" (UID: \"88cc1d26-0b9e-4b34-9280-848826e923c0\") " pod="openshift-infra/auto-csr-approver-29564024-vf9tx"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.340005 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hkqf\" (UniqueName: \"kubernetes.io/projected/88cc1d26-0b9e-4b34-9280-848826e923c0-kube-api-access-5hkqf\") pod \"auto-csr-approver-29564024-vf9tx\" (UID: \"88cc1d26-0b9e-4b34-9280-848826e923c0\") " pod="openshift-infra/auto-csr-approver-29564024-vf9tx"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.364227 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hkqf\" (UniqueName: \"kubernetes.io/projected/88cc1d26-0b9e-4b34-9280-848826e923c0-kube-api-access-5hkqf\") pod \"auto-csr-approver-29564024-vf9tx\" (UID: \"88cc1d26-0b9e-4b34-9280-848826e923c0\") " pod="openshift-infra/auto-csr-approver-29564024-vf9tx"
Mar 18 13:44:00 crc kubenswrapper[4912]: I0318 13:44:00.481890 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-vf9tx"
Mar 18 13:44:01 crc kubenswrapper[4912]: I0318 13:44:01.015783 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-vf9tx"]
Mar 18 13:44:01 crc kubenswrapper[4912]: I0318 13:44:01.121246 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-vf9tx" event={"ID":"88cc1d26-0b9e-4b34-9280-848826e923c0","Type":"ContainerStarted","Data":"ad274b2815273b130c7fdcddae686d56abe5ef8f128f9e54937f0bdc547c81b1"}
Mar 18 13:44:03 crc kubenswrapper[4912]: I0318 13:44:03.146807 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-vf9tx" event={"ID":"88cc1d26-0b9e-4b34-9280-848826e923c0","Type":"ContainerStarted","Data":"83f5672d1b9f04d1fb404e096c1ba071bd79aa1597a17f73e2e4c2216a59af25"}
Mar 18 13:44:03 crc kubenswrapper[4912]: I0318 13:44:03.176377 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564024-vf9tx" podStartSLOduration=1.6308647889999999 podStartE2EDuration="3.17635498s" podCreationTimestamp="2026-03-18 13:44:00 +0000 UTC" firstStartedPulling="2026-03-18 13:44:01.017877334 +0000 UTC m=+2489.477304759" lastFinishedPulling="2026-03-18 13:44:02.563367525 +0000 UTC m=+2491.022794950" observedRunningTime="2026-03-18 13:44:03.172444354 +0000 UTC m=+2491.631871779" watchObservedRunningTime="2026-03-18 13:44:03.17635498 +0000 UTC m=+2491.635782405"
Mar 18 13:44:03 crc kubenswrapper[4912]: I0318 13:44:03.189659 4912 scope.go:117] "RemoveContainer" containerID="f2978ffac8f239da1dfd31725e618d0fe53b53fa43c2c2886a7f1ffd2540d727"
Mar 18 13:44:04 crc kubenswrapper[4912]: I0318 13:44:04.170543 4912 generic.go:334] "Generic (PLEG): container finished" podID="88cc1d26-0b9e-4b34-9280-848826e923c0" containerID="83f5672d1b9f04d1fb404e096c1ba071bd79aa1597a17f73e2e4c2216a59af25" exitCode=0
Mar 18 13:44:04 crc kubenswrapper[4912]: I0318 13:44:04.171087 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-vf9tx" event={"ID":"88cc1d26-0b9e-4b34-9280-848826e923c0","Type":"ContainerDied","Data":"83f5672d1b9f04d1fb404e096c1ba071bd79aa1597a17f73e2e4c2216a59af25"}
Mar 18 13:44:05 crc kubenswrapper[4912]: I0318 13:44:05.651972 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-vf9tx"
Mar 18 13:44:05 crc kubenswrapper[4912]: I0318 13:44:05.711934 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hkqf\" (UniqueName: \"kubernetes.io/projected/88cc1d26-0b9e-4b34-9280-848826e923c0-kube-api-access-5hkqf\") pod \"88cc1d26-0b9e-4b34-9280-848826e923c0\" (UID: \"88cc1d26-0b9e-4b34-9280-848826e923c0\") "
Mar 18 13:44:05 crc kubenswrapper[4912]: I0318 13:44:05.721619 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88cc1d26-0b9e-4b34-9280-848826e923c0-kube-api-access-5hkqf" (OuterVolumeSpecName: "kube-api-access-5hkqf") pod "88cc1d26-0b9e-4b34-9280-848826e923c0" (UID: "88cc1d26-0b9e-4b34-9280-848826e923c0"). InnerVolumeSpecName "kube-api-access-5hkqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:44:05 crc kubenswrapper[4912]: I0318 13:44:05.818100 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hkqf\" (UniqueName: \"kubernetes.io/projected/88cc1d26-0b9e-4b34-9280-848826e923c0-kube-api-access-5hkqf\") on node \"crc\" DevicePath \"\""
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.194028 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-vf9tx"
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.193945 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-vf9tx" event={"ID":"88cc1d26-0b9e-4b34-9280-848826e923c0","Type":"ContainerDied","Data":"ad274b2815273b130c7fdcddae686d56abe5ef8f128f9e54937f0bdc547c81b1"}
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.204338 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad274b2815273b130c7fdcddae686d56abe5ef8f128f9e54937f0bdc547c81b1"
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.742596 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-dnn8m"]
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.755520 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-dnn8m"]
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.999571 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.999671 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:44:06 crc kubenswrapper[4912]: I0318 13:44:06.999744 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g"
Mar 18 13:44:07 crc kubenswrapper[4912]: I0318 13:44:07.001144 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 13:44:07 crc kubenswrapper[4912]: I0318 13:44:07.001230 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" gracePeriod=600
Mar 18 13:44:07 crc kubenswrapper[4912]: E0318 13:44:07.136457 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:44:07 crc kubenswrapper[4912]: I0318 13:44:07.206959 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" exitCode=0
Mar 18 13:44:07 crc kubenswrapper[4912]: I0318 13:44:07.207059 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753"}
Mar 18 13:44:07 crc kubenswrapper[4912]: I0318 13:44:07.207132 4912 scope.go:117] "RemoveContainer" containerID="02489b7ecd2b498e233504537b25815fc33d2b81d54b04defc2536e094a7ae21"
Mar 18 13:44:07 crc kubenswrapper[4912]: I0318 13:44:07.208201 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753"
Mar 18 13:44:07 crc kubenswrapper[4912]: E0318 13:44:07.208761 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:44:08 crc kubenswrapper[4912]: I0318 13:44:08.240987 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35be5a84-2210-42f0-9e5c-dcbfcc58dad2" path="/var/lib/kubelet/pods/35be5a84-2210-42f0-9e5c-dcbfcc58dad2/volumes"
Mar 18 13:44:18 crc kubenswrapper[4912]: I0318 13:44:18.376556 4912 generic.go:334] "Generic (PLEG): container finished" podID="0e4b3676-5cda-4170-86e4-ce503ec22aa0" containerID="3b00a7bb3b93c57dd3c2ec919507ef237bf55c913e18e9163dfe2b55cd397675" exitCode=0
Mar 18 13:44:18 crc kubenswrapper[4912]: I0318 13:44:18.376910 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh" event={"ID":"0e4b3676-5cda-4170-86e4-ce503ec22aa0","Type":"ContainerDied","Data":"3b00a7bb3b93c57dd3c2ec919507ef237bf55c913e18e9163dfe2b55cd397675"}
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.888086 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh"
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.917723 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-nova-metadata-neutron-config-0\") pod \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") "
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.917823 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-ssh-key-openstack-edpm-ipam\") pod \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") "
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.917900 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-metadata-combined-ca-bundle\") pod \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") "
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.918061 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-inventory\") pod \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") "
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.918128 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc8b9\" (UniqueName: \"kubernetes.io/projected/0e4b3676-5cda-4170-86e4-ce503ec22aa0-kube-api-access-dc8b9\") pod \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") "
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.918217 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\" (UID: \"0e4b3676-5cda-4170-86e4-ce503ec22aa0\") "
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.924679 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0e4b3676-5cda-4170-86e4-ce503ec22aa0" (UID: "0e4b3676-5cda-4170-86e4-ce503ec22aa0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.935395 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e4b3676-5cda-4170-86e4-ce503ec22aa0-kube-api-access-dc8b9" (OuterVolumeSpecName: "kube-api-access-dc8b9") pod "0e4b3676-5cda-4170-86e4-ce503ec22aa0" (UID: "0e4b3676-5cda-4170-86e4-ce503ec22aa0"). InnerVolumeSpecName "kube-api-access-dc8b9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.964547 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e4b3676-5cda-4170-86e4-ce503ec22aa0" (UID: "0e4b3676-5cda-4170-86e4-ce503ec22aa0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.972854 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-inventory" (OuterVolumeSpecName: "inventory") pod "0e4b3676-5cda-4170-86e4-ce503ec22aa0" (UID: "0e4b3676-5cda-4170-86e4-ce503ec22aa0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.974598 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "0e4b3676-5cda-4170-86e4-ce503ec22aa0" (UID: "0e4b3676-5cda-4170-86e4-ce503ec22aa0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:44:19 crc kubenswrapper[4912]: I0318 13:44:19.978028 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "0e4b3676-5cda-4170-86e4-ce503ec22aa0" (UID: "0e4b3676-5cda-4170-86e4-ce503ec22aa0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.022357 4912 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.022464 4912 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.022479 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.022490 4912 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.022506 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4b3676-5cda-4170-86e4-ce503ec22aa0-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.022515 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8b9\" (UniqueName: \"kubernetes.io/projected/0e4b3676-5cda-4170-86e4-ce503ec22aa0-kube-api-access-dc8b9\") on node \"crc\" DevicePath \"\""
Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.404943 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh" event={"ID":"0e4b3676-5cda-4170-86e4-ce503ec22aa0","Type":"ContainerDied","Data":"b40eaeca47a94ad94adb7d8312ca9aa17e219d7b52205a9ee99ca365405f1e58"} Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.405017 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b40eaeca47a94ad94adb7d8312ca9aa17e219d7b52205a9ee99ca365405f1e58" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.405175 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.530994 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj"] Mar 18 13:44:20 crc kubenswrapper[4912]: E0318 13:44:20.531989 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cc1d26-0b9e-4b34-9280-848826e923c0" containerName="oc" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.532017 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cc1d26-0b9e-4b34-9280-848826e923c0" containerName="oc" Mar 18 13:44:20 crc kubenswrapper[4912]: E0318 13:44:20.532067 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e4b3676-5cda-4170-86e4-ce503ec22aa0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.532112 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e4b3676-5cda-4170-86e4-ce503ec22aa0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.532598 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="88cc1d26-0b9e-4b34-9280-848826e923c0" containerName="oc" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.532675 4912 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="0e4b3676-5cda-4170-86e4-ce503ec22aa0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.534429 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.541394 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.541432 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.541760 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.543234 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.543434 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.551878 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj"] Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.566721 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfjn\" (UniqueName: \"kubernetes.io/projected/04124f84-2385-4e87-b1c6-ac325ca92d7a-kube-api-access-9tfjn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.566816 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.566920 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.566995 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.567017 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.668829 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.669065 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.669155 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.669177 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.669268 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfjn\" (UniqueName: \"kubernetes.io/projected/04124f84-2385-4e87-b1c6-ac325ca92d7a-kube-api-access-9tfjn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 
13:44:20.673177 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.673341 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.673605 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.677918 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.692809 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfjn\" (UniqueName: \"kubernetes.io/projected/04124f84-2385-4e87-b1c6-ac325ca92d7a-kube-api-access-9tfjn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj\" 
(UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:20 crc kubenswrapper[4912]: I0318 13:44:20.864714 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:44:21 crc kubenswrapper[4912]: I0318 13:44:21.228222 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:44:21 crc kubenswrapper[4912]: E0318 13:44:21.229658 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:44:21 crc kubenswrapper[4912]: I0318 13:44:21.473361 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj"] Mar 18 13:44:22 crc kubenswrapper[4912]: I0318 13:44:22.430652 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" event={"ID":"04124f84-2385-4e87-b1c6-ac325ca92d7a","Type":"ContainerStarted","Data":"a4c8ff332da61d727cf7e60f4e267d14101d0b9e2ef2c0b087c2e6fcc4e34ba0"} Mar 18 13:44:23 crc kubenswrapper[4912]: I0318 13:44:23.445018 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" event={"ID":"04124f84-2385-4e87-b1c6-ac325ca92d7a","Type":"ContainerStarted","Data":"c9986267604994e043651836892a38fb604d07033d5b8b0fd63a68678a2f5b98"} Mar 18 13:44:23 crc kubenswrapper[4912]: I0318 13:44:23.480201 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" podStartSLOduration=2.553976915 podStartE2EDuration="3.48017469s" podCreationTimestamp="2026-03-18 13:44:20 +0000 UTC" firstStartedPulling="2026-03-18 13:44:21.483518008 +0000 UTC m=+2509.942945433" lastFinishedPulling="2026-03-18 13:44:22.409715783 +0000 UTC m=+2510.869143208" observedRunningTime="2026-03-18 13:44:23.467270351 +0000 UTC m=+2511.926697786" watchObservedRunningTime="2026-03-18 13:44:23.48017469 +0000 UTC m=+2511.939602115" Mar 18 13:44:36 crc kubenswrapper[4912]: I0318 13:44:36.229080 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:44:36 crc kubenswrapper[4912]: E0318 13:44:36.230284 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:44:49 crc kubenswrapper[4912]: I0318 13:44:49.228456 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:44:49 crc kubenswrapper[4912]: E0318 13:44:49.229380 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.155397 4912 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2"] Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.158343 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.161933 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.164378 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.191288 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2"] Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.218615 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e1ae8-1b4b-4562-aa08-04d7a1023654-config-volume\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.220109 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e1ae8-1b4b-4562-aa08-04d7a1023654-secret-volume\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.220297 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj4b7\" (UniqueName: 
\"kubernetes.io/projected/c39e1ae8-1b4b-4562-aa08-04d7a1023654-kube-api-access-zj4b7\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.230587 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:45:00 crc kubenswrapper[4912]: E0318 13:45:00.231005 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.324542 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj4b7\" (UniqueName: \"kubernetes.io/projected/c39e1ae8-1b4b-4562-aa08-04d7a1023654-kube-api-access-zj4b7\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.325254 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e1ae8-1b4b-4562-aa08-04d7a1023654-config-volume\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.325568 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c39e1ae8-1b4b-4562-aa08-04d7a1023654-secret-volume\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.328477 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e1ae8-1b4b-4562-aa08-04d7a1023654-config-volume\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.335525 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e1ae8-1b4b-4562-aa08-04d7a1023654-secret-volume\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.345439 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj4b7\" (UniqueName: \"kubernetes.io/projected/c39e1ae8-1b4b-4562-aa08-04d7a1023654-kube-api-access-zj4b7\") pod \"collect-profiles-29564025-5sjt2\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.489252 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:00 crc kubenswrapper[4912]: I0318 13:45:00.970593 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2"] Mar 18 13:45:01 crc kubenswrapper[4912]: I0318 13:45:01.901459 4912 generic.go:334] "Generic (PLEG): container finished" podID="c39e1ae8-1b4b-4562-aa08-04d7a1023654" containerID="ba245ed04fc2e89a406c2c3c06ca334c776ff4821974439487e5cdcdd41faa32" exitCode=0 Mar 18 13:45:01 crc kubenswrapper[4912]: I0318 13:45:01.901947 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" event={"ID":"c39e1ae8-1b4b-4562-aa08-04d7a1023654","Type":"ContainerDied","Data":"ba245ed04fc2e89a406c2c3c06ca334c776ff4821974439487e5cdcdd41faa32"} Mar 18 13:45:01 crc kubenswrapper[4912]: I0318 13:45:01.901992 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" event={"ID":"c39e1ae8-1b4b-4562-aa08-04d7a1023654","Type":"ContainerStarted","Data":"c41a91176ac640233884ac783dfb9740bdb7af9eb5eea72e900eb176fd87b54b"} Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.314773 4912 scope.go:117] "RemoveContainer" containerID="aa655a9ea2f76dc71292ee1353cfbb5ccdb958cd9c458903d5b176394d6c824f" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.499031 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.514141 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj4b7\" (UniqueName: \"kubernetes.io/projected/c39e1ae8-1b4b-4562-aa08-04d7a1023654-kube-api-access-zj4b7\") pod \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.514451 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e1ae8-1b4b-4562-aa08-04d7a1023654-secret-volume\") pod \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.514751 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e1ae8-1b4b-4562-aa08-04d7a1023654-config-volume\") pod \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\" (UID: \"c39e1ae8-1b4b-4562-aa08-04d7a1023654\") " Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.515736 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39e1ae8-1b4b-4562-aa08-04d7a1023654-config-volume" (OuterVolumeSpecName: "config-volume") pod "c39e1ae8-1b4b-4562-aa08-04d7a1023654" (UID: "c39e1ae8-1b4b-4562-aa08-04d7a1023654"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.520797 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e1ae8-1b4b-4562-aa08-04d7a1023654-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.525583 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39e1ae8-1b4b-4562-aa08-04d7a1023654-kube-api-access-zj4b7" (OuterVolumeSpecName: "kube-api-access-zj4b7") pod "c39e1ae8-1b4b-4562-aa08-04d7a1023654" (UID: "c39e1ae8-1b4b-4562-aa08-04d7a1023654"). InnerVolumeSpecName "kube-api-access-zj4b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.527242 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c39e1ae8-1b4b-4562-aa08-04d7a1023654-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c39e1ae8-1b4b-4562-aa08-04d7a1023654" (UID: "c39e1ae8-1b4b-4562-aa08-04d7a1023654"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.623764 4912 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e1ae8-1b4b-4562-aa08-04d7a1023654-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.623821 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj4b7\" (UniqueName: \"kubernetes.io/projected/c39e1ae8-1b4b-4562-aa08-04d7a1023654-kube-api-access-zj4b7\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.926266 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" event={"ID":"c39e1ae8-1b4b-4562-aa08-04d7a1023654","Type":"ContainerDied","Data":"c41a91176ac640233884ac783dfb9740bdb7af9eb5eea72e900eb176fd87b54b"} Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.926317 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c41a91176ac640233884ac783dfb9740bdb7af9eb5eea72e900eb176fd87b54b" Mar 18 13:45:03 crc kubenswrapper[4912]: I0318 13:45:03.926351 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2" Mar 18 13:45:04 crc kubenswrapper[4912]: I0318 13:45:04.591538 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"] Mar 18 13:45:04 crc kubenswrapper[4912]: I0318 13:45:04.602904 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-8kx5f"] Mar 18 13:45:06 crc kubenswrapper[4912]: I0318 13:45:06.246520 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f844c0-ca4c-4097-bedd-bbb4323cc717" path="/var/lib/kubelet/pods/d5f844c0-ca4c-4097-bedd-bbb4323cc717/volumes" Mar 18 13:45:11 crc kubenswrapper[4912]: I0318 13:45:11.228706 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:45:11 crc kubenswrapper[4912]: E0318 13:45:11.229763 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:45:26 crc kubenswrapper[4912]: I0318 13:45:26.228521 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:45:26 crc kubenswrapper[4912]: E0318 13:45:26.230591 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:45:38 crc kubenswrapper[4912]: I0318 13:45:38.227969 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:45:38 crc kubenswrapper[4912]: E0318 13:45:38.228837 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:45:53 crc kubenswrapper[4912]: I0318 13:45:53.229150 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:45:53 crc kubenswrapper[4912]: E0318 13:45:53.230283 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.162948 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564026-lrc4k"] Mar 18 13:46:00 crc kubenswrapper[4912]: E0318 13:46:00.166016 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39e1ae8-1b4b-4562-aa08-04d7a1023654" containerName="collect-profiles" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.166185 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39e1ae8-1b4b-4562-aa08-04d7a1023654" 
containerName="collect-profiles" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.166482 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39e1ae8-1b4b-4562-aa08-04d7a1023654" containerName="collect-profiles" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.167641 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.171877 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.172301 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.172399 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.180459 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-lrc4k"] Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.365420 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsdpn\" (UniqueName: \"kubernetes.io/projected/ae4c29c5-28a9-4dda-ab06-f018f3edf59c-kube-api-access-lsdpn\") pod \"auto-csr-approver-29564026-lrc4k\" (UID: \"ae4c29c5-28a9-4dda-ab06-f018f3edf59c\") " pod="openshift-infra/auto-csr-approver-29564026-lrc4k" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.469057 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsdpn\" (UniqueName: \"kubernetes.io/projected/ae4c29c5-28a9-4dda-ab06-f018f3edf59c-kube-api-access-lsdpn\") pod \"auto-csr-approver-29564026-lrc4k\" (UID: \"ae4c29c5-28a9-4dda-ab06-f018f3edf59c\") " pod="openshift-infra/auto-csr-approver-29564026-lrc4k" Mar 18 
13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.492728 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsdpn\" (UniqueName: \"kubernetes.io/projected/ae4c29c5-28a9-4dda-ab06-f018f3edf59c-kube-api-access-lsdpn\") pod \"auto-csr-approver-29564026-lrc4k\" (UID: \"ae4c29c5-28a9-4dda-ab06-f018f3edf59c\") " pod="openshift-infra/auto-csr-approver-29564026-lrc4k" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.501852 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" Mar 18 13:46:00 crc kubenswrapper[4912]: I0318 13:46:00.991196 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-lrc4k"] Mar 18 13:46:01 crc kubenswrapper[4912]: I0318 13:46:01.634988 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" event={"ID":"ae4c29c5-28a9-4dda-ab06-f018f3edf59c","Type":"ContainerStarted","Data":"d4a79be59bad0c46636ca58081914e5c9621d36c45f23f5556c28b02abf4f513"} Mar 18 13:46:02 crc kubenswrapper[4912]: I0318 13:46:02.653939 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" event={"ID":"ae4c29c5-28a9-4dda-ab06-f018f3edf59c","Type":"ContainerStarted","Data":"c36dd6f5e632ffa18be867f240e33100d34ebd84fc166d4160017de4af57f4e6"} Mar 18 13:46:02 crc kubenswrapper[4912]: I0318 13:46:02.681735 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" podStartSLOduration=1.39036507 podStartE2EDuration="2.681709885s" podCreationTimestamp="2026-03-18 13:46:00 +0000 UTC" firstStartedPulling="2026-03-18 13:46:00.995399385 +0000 UTC m=+2609.454826810" lastFinishedPulling="2026-03-18 13:46:02.2867442 +0000 UTC m=+2610.746171625" observedRunningTime="2026-03-18 13:46:02.671448688 +0000 UTC m=+2611.130876123" 
watchObservedRunningTime="2026-03-18 13:46:02.681709885 +0000 UTC m=+2611.141137310" Mar 18 13:46:03 crc kubenswrapper[4912]: I0318 13:46:03.518990 4912 scope.go:117] "RemoveContainer" containerID="fea6cdf4fb3627a067c4cd496d9aa42111112ee10440ea9b47d6c59848717143" Mar 18 13:46:03 crc kubenswrapper[4912]: I0318 13:46:03.675568 4912 generic.go:334] "Generic (PLEG): container finished" podID="ae4c29c5-28a9-4dda-ab06-f018f3edf59c" containerID="c36dd6f5e632ffa18be867f240e33100d34ebd84fc166d4160017de4af57f4e6" exitCode=0 Mar 18 13:46:03 crc kubenswrapper[4912]: I0318 13:46:03.675618 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" event={"ID":"ae4c29c5-28a9-4dda-ab06-f018f3edf59c","Type":"ContainerDied","Data":"c36dd6f5e632ffa18be867f240e33100d34ebd84fc166d4160017de4af57f4e6"} Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.132233 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.319376 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsdpn\" (UniqueName: \"kubernetes.io/projected/ae4c29c5-28a9-4dda-ab06-f018f3edf59c-kube-api-access-lsdpn\") pod \"ae4c29c5-28a9-4dda-ab06-f018f3edf59c\" (UID: \"ae4c29c5-28a9-4dda-ab06-f018f3edf59c\") " Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.326323 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4c29c5-28a9-4dda-ab06-f018f3edf59c-kube-api-access-lsdpn" (OuterVolumeSpecName: "kube-api-access-lsdpn") pod "ae4c29c5-28a9-4dda-ab06-f018f3edf59c" (UID: "ae4c29c5-28a9-4dda-ab06-f018f3edf59c"). InnerVolumeSpecName "kube-api-access-lsdpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.349693 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-fjsrf"] Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.359757 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-fjsrf"] Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.423883 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsdpn\" (UniqueName: \"kubernetes.io/projected/ae4c29c5-28a9-4dda-ab06-f018f3edf59c-kube-api-access-lsdpn\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.700211 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.700238 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-lrc4k" event={"ID":"ae4c29c5-28a9-4dda-ab06-f018f3edf59c","Type":"ContainerDied","Data":"d4a79be59bad0c46636ca58081914e5c9621d36c45f23f5556c28b02abf4f513"} Mar 18 13:46:05 crc kubenswrapper[4912]: I0318 13:46:05.700274 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4a79be59bad0c46636ca58081914e5c9621d36c45f23f5556c28b02abf4f513" Mar 18 13:46:06 crc kubenswrapper[4912]: I0318 13:46:06.229468 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:46:06 crc kubenswrapper[4912]: E0318 13:46:06.230371 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:46:06 crc kubenswrapper[4912]: I0318 13:46:06.243733 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264a6d26-eab2-4104-b0b4-bdd17eb770ed" path="/var/lib/kubelet/pods/264a6d26-eab2-4104-b0b4-bdd17eb770ed/volumes" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.309897 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fw6np"] Mar 18 13:46:08 crc kubenswrapper[4912]: E0318 13:46:08.314375 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4c29c5-28a9-4dda-ab06-f018f3edf59c" containerName="oc" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.314435 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4c29c5-28a9-4dda-ab06-f018f3edf59c" containerName="oc" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.314938 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4c29c5-28a9-4dda-ab06-f018f3edf59c" containerName="oc" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.317890 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.333615 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw6np"] Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.421029 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-catalog-content\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.421612 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzs65\" (UniqueName: \"kubernetes.io/projected/4baa5240-9364-41e2-a1f7-942fa758d50e-kube-api-access-tzs65\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.421951 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-utilities\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.525933 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-utilities\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.526229 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-catalog-content\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.526295 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzs65\" (UniqueName: \"kubernetes.io/projected/4baa5240-9364-41e2-a1f7-942fa758d50e-kube-api-access-tzs65\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.526718 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-catalog-content\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.526718 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-utilities\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.549340 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzs65\" (UniqueName: \"kubernetes.io/projected/4baa5240-9364-41e2-a1f7-942fa758d50e-kube-api-access-tzs65\") pod \"redhat-marketplace-fw6np\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:08 crc kubenswrapper[4912]: I0318 13:46:08.657418 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:09 crc kubenswrapper[4912]: I0318 13:46:09.218994 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw6np"] Mar 18 13:46:09 crc kubenswrapper[4912]: I0318 13:46:09.769711 4912 generic.go:334] "Generic (PLEG): container finished" podID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerID="2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2" exitCode=0 Mar 18 13:46:09 crc kubenswrapper[4912]: I0318 13:46:09.769782 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw6np" event={"ID":"4baa5240-9364-41e2-a1f7-942fa758d50e","Type":"ContainerDied","Data":"2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2"} Mar 18 13:46:09 crc kubenswrapper[4912]: I0318 13:46:09.770237 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw6np" event={"ID":"4baa5240-9364-41e2-a1f7-942fa758d50e","Type":"ContainerStarted","Data":"66d7dc0cd5aef5219c0a3f765d7c8adf5dc6b94cf863b8e7a011857978a92878"} Mar 18 13:46:11 crc kubenswrapper[4912]: I0318 13:46:11.797404 4912 generic.go:334] "Generic (PLEG): container finished" podID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerID="06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148" exitCode=0 Mar 18 13:46:11 crc kubenswrapper[4912]: I0318 13:46:11.797487 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw6np" event={"ID":"4baa5240-9364-41e2-a1f7-942fa758d50e","Type":"ContainerDied","Data":"06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148"} Mar 18 13:46:12 crc kubenswrapper[4912]: I0318 13:46:12.821262 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw6np" 
event={"ID":"4baa5240-9364-41e2-a1f7-942fa758d50e","Type":"ContainerStarted","Data":"bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9"} Mar 18 13:46:12 crc kubenswrapper[4912]: I0318 13:46:12.854310 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fw6np" podStartSLOduration=2.363795285 podStartE2EDuration="4.854285537s" podCreationTimestamp="2026-03-18 13:46:08 +0000 UTC" firstStartedPulling="2026-03-18 13:46:09.773736037 +0000 UTC m=+2618.233163462" lastFinishedPulling="2026-03-18 13:46:12.264226289 +0000 UTC m=+2620.723653714" observedRunningTime="2026-03-18 13:46:12.843464145 +0000 UTC m=+2621.302891580" watchObservedRunningTime="2026-03-18 13:46:12.854285537 +0000 UTC m=+2621.313712962" Mar 18 13:46:18 crc kubenswrapper[4912]: I0318 13:46:18.228427 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:46:18 crc kubenswrapper[4912]: E0318 13:46:18.229576 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:46:18 crc kubenswrapper[4912]: I0318 13:46:18.657604 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:18 crc kubenswrapper[4912]: I0318 13:46:18.657668 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:18 crc kubenswrapper[4912]: I0318 13:46:18.707530 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:18 crc kubenswrapper[4912]: I0318 13:46:18.962297 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:19 crc kubenswrapper[4912]: I0318 13:46:19.025546 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw6np"] Mar 18 13:46:20 crc kubenswrapper[4912]: I0318 13:46:20.934068 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fw6np" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="registry-server" containerID="cri-o://bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9" gracePeriod=2 Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.473212 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.600788 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzs65\" (UniqueName: \"kubernetes.io/projected/4baa5240-9364-41e2-a1f7-942fa758d50e-kube-api-access-tzs65\") pod \"4baa5240-9364-41e2-a1f7-942fa758d50e\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.600887 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-catalog-content\") pod \"4baa5240-9364-41e2-a1f7-942fa758d50e\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.600983 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-utilities\") pod 
\"4baa5240-9364-41e2-a1f7-942fa758d50e\" (UID: \"4baa5240-9364-41e2-a1f7-942fa758d50e\") " Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.602209 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-utilities" (OuterVolumeSpecName: "utilities") pod "4baa5240-9364-41e2-a1f7-942fa758d50e" (UID: "4baa5240-9364-41e2-a1f7-942fa758d50e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.610328 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4baa5240-9364-41e2-a1f7-942fa758d50e-kube-api-access-tzs65" (OuterVolumeSpecName: "kube-api-access-tzs65") pod "4baa5240-9364-41e2-a1f7-942fa758d50e" (UID: "4baa5240-9364-41e2-a1f7-942fa758d50e"). InnerVolumeSpecName "kube-api-access-tzs65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.666531 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4baa5240-9364-41e2-a1f7-942fa758d50e" (UID: "4baa5240-9364-41e2-a1f7-942fa758d50e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.704833 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzs65\" (UniqueName: \"kubernetes.io/projected/4baa5240-9364-41e2-a1f7-942fa758d50e-kube-api-access-tzs65\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.705259 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.705361 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa5240-9364-41e2-a1f7-942fa758d50e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.949906 4912 generic.go:334] "Generic (PLEG): container finished" podID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerID="bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9" exitCode=0 Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.949990 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw6np" event={"ID":"4baa5240-9364-41e2-a1f7-942fa758d50e","Type":"ContainerDied","Data":"bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9"} Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.950029 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw6np" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.950035 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw6np" event={"ID":"4baa5240-9364-41e2-a1f7-942fa758d50e","Type":"ContainerDied","Data":"66d7dc0cd5aef5219c0a3f765d7c8adf5dc6b94cf863b8e7a011857978a92878"} Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.950098 4912 scope.go:117] "RemoveContainer" containerID="bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9" Mar 18 13:46:21 crc kubenswrapper[4912]: I0318 13:46:21.995281 4912 scope.go:117] "RemoveContainer" containerID="06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.001935 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw6np"] Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.015265 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw6np"] Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.022195 4912 scope.go:117] "RemoveContainer" containerID="2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.090659 4912 scope.go:117] "RemoveContainer" containerID="bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9" Mar 18 13:46:22 crc kubenswrapper[4912]: E0318 13:46:22.093096 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9\": container with ID starting with bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9 not found: ID does not exist" containerID="bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.093197 4912 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9"} err="failed to get container status \"bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9\": rpc error: code = NotFound desc = could not find container \"bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9\": container with ID starting with bf31a5b6cfa0b5aab6459ae9493ba4b5784d25e0b8abba4ae153398fee8855c9 not found: ID does not exist" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.093240 4912 scope.go:117] "RemoveContainer" containerID="06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148" Mar 18 13:46:22 crc kubenswrapper[4912]: E0318 13:46:22.093773 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148\": container with ID starting with 06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148 not found: ID does not exist" containerID="06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.093817 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148"} err="failed to get container status \"06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148\": rpc error: code = NotFound desc = could not find container \"06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148\": container with ID starting with 06e2565dce9e4f2d3f7f8ff32ce977cbea2c6d6dc2618f7b339f93cae91bd148 not found: ID does not exist" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.093850 4912 scope.go:117] "RemoveContainer" containerID="2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2" Mar 18 13:46:22 crc kubenswrapper[4912]: E0318 
13:46:22.094173 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2\": container with ID starting with 2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2 not found: ID does not exist" containerID="2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.094206 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2"} err="failed to get container status \"2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2\": rpc error: code = NotFound desc = could not find container \"2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2\": container with ID starting with 2918b5f4f70a8a5d05b17bb5f65a7beddac5fdbec0e34a989c17ac8e17a212d2 not found: ID does not exist" Mar 18 13:46:22 crc kubenswrapper[4912]: I0318 13:46:22.244097 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" path="/var/lib/kubelet/pods/4baa5240-9364-41e2-a1f7-942fa758d50e/volumes" Mar 18 13:46:31 crc kubenswrapper[4912]: I0318 13:46:31.228896 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:46:31 crc kubenswrapper[4912]: E0318 13:46:31.229967 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:46:44 crc kubenswrapper[4912]: I0318 13:46:44.228687 
4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:46:44 crc kubenswrapper[4912]: E0318 13:46:44.230294 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:46:58 crc kubenswrapper[4912]: I0318 13:46:58.228874 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:46:58 crc kubenswrapper[4912]: E0318 13:46:58.230337 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:47:03 crc kubenswrapper[4912]: I0318 13:47:03.602350 4912 scope.go:117] "RemoveContainer" containerID="d0a918a68f10e1d6549125370736916a44135734ea9852d673c60819f11a4744" Mar 18 13:47:09 crc kubenswrapper[4912]: I0318 13:47:09.228555 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:47:09 crc kubenswrapper[4912]: E0318 13:47:09.230003 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:47:24 crc kubenswrapper[4912]: I0318 13:47:24.229198 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:47:24 crc kubenswrapper[4912]: E0318 13:47:24.230654 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:47:39 crc kubenswrapper[4912]: I0318 13:47:39.228843 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:47:39 crc kubenswrapper[4912]: E0318 13:47:39.229960 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:47:52 crc kubenswrapper[4912]: I0318 13:47:52.236416 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:47:52 crc kubenswrapper[4912]: E0318 13:47:52.237543 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.170917 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564028-mq8nq"] Mar 18 13:48:00 crc kubenswrapper[4912]: E0318 13:48:00.173165 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="registry-server" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.173194 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="registry-server" Mar 18 13:48:00 crc kubenswrapper[4912]: E0318 13:48:00.173262 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="extract-content" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.173273 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="extract-content" Mar 18 13:48:00 crc kubenswrapper[4912]: E0318 13:48:00.173310 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="extract-utilities" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.173320 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="extract-utilities" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.174122 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4baa5240-9364-41e2-a1f7-942fa758d50e" containerName="registry-server" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.175771 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-mq8nq" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.179814 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.180294 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.180558 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.184807 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-mq8nq"] Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.312990 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7nh\" (UniqueName: \"kubernetes.io/projected/038d624a-12c8-4de7-84b3-1d84c0e7ffe3-kube-api-access-9z7nh\") pod \"auto-csr-approver-29564028-mq8nq\" (UID: \"038d624a-12c8-4de7-84b3-1d84c0e7ffe3\") " pod="openshift-infra/auto-csr-approver-29564028-mq8nq" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.415974 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7nh\" (UniqueName: \"kubernetes.io/projected/038d624a-12c8-4de7-84b3-1d84c0e7ffe3-kube-api-access-9z7nh\") pod \"auto-csr-approver-29564028-mq8nq\" (UID: \"038d624a-12c8-4de7-84b3-1d84c0e7ffe3\") " pod="openshift-infra/auto-csr-approver-29564028-mq8nq" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.437909 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7nh\" (UniqueName: \"kubernetes.io/projected/038d624a-12c8-4de7-84b3-1d84c0e7ffe3-kube-api-access-9z7nh\") pod \"auto-csr-approver-29564028-mq8nq\" (UID: \"038d624a-12c8-4de7-84b3-1d84c0e7ffe3\") " 
pod="openshift-infra/auto-csr-approver-29564028-mq8nq" Mar 18 13:48:00 crc kubenswrapper[4912]: I0318 13:48:00.511205 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-mq8nq" Mar 18 13:48:01 crc kubenswrapper[4912]: I0318 13:48:01.016164 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-mq8nq"] Mar 18 13:48:01 crc kubenswrapper[4912]: I0318 13:48:01.171479 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-mq8nq" event={"ID":"038d624a-12c8-4de7-84b3-1d84c0e7ffe3","Type":"ContainerStarted","Data":"3b5c3b7b50c064315472e2bf77e339b7d1cc4010a2665b5cc5e7780ed3e109d6"} Mar 18 13:48:03 crc kubenswrapper[4912]: I0318 13:48:03.202932 4912 generic.go:334] "Generic (PLEG): container finished" podID="038d624a-12c8-4de7-84b3-1d84c0e7ffe3" containerID="7c94629f3dbe7e99df3ee1a3e0890393b9d65c17d9608e5f598113cb13d1160e" exitCode=0 Mar 18 13:48:03 crc kubenswrapper[4912]: I0318 13:48:03.203063 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-mq8nq" event={"ID":"038d624a-12c8-4de7-84b3-1d84c0e7ffe3","Type":"ContainerDied","Data":"7c94629f3dbe7e99df3ee1a3e0890393b9d65c17d9608e5f598113cb13d1160e"} Mar 18 13:48:03 crc kubenswrapper[4912]: I0318 13:48:03.228803 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:48:03 crc kubenswrapper[4912]: E0318 13:48:03.229236 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" 
Mar 18 13:48:04 crc kubenswrapper[4912]: I0318 13:48:04.662423 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-mq8nq" Mar 18 13:48:04 crc kubenswrapper[4912]: I0318 13:48:04.762275 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z7nh\" (UniqueName: \"kubernetes.io/projected/038d624a-12c8-4de7-84b3-1d84c0e7ffe3-kube-api-access-9z7nh\") pod \"038d624a-12c8-4de7-84b3-1d84c0e7ffe3\" (UID: \"038d624a-12c8-4de7-84b3-1d84c0e7ffe3\") " Mar 18 13:48:04 crc kubenswrapper[4912]: I0318 13:48:04.770916 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038d624a-12c8-4de7-84b3-1d84c0e7ffe3-kube-api-access-9z7nh" (OuterVolumeSpecName: "kube-api-access-9z7nh") pod "038d624a-12c8-4de7-84b3-1d84c0e7ffe3" (UID: "038d624a-12c8-4de7-84b3-1d84c0e7ffe3"). InnerVolumeSpecName "kube-api-access-9z7nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:48:04 crc kubenswrapper[4912]: I0318 13:48:04.866641 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z7nh\" (UniqueName: \"kubernetes.io/projected/038d624a-12c8-4de7-84b3-1d84c0e7ffe3-kube-api-access-9z7nh\") on node \"crc\" DevicePath \"\"" Mar 18 13:48:05 crc kubenswrapper[4912]: I0318 13:48:05.230318 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-mq8nq" Mar 18 13:48:05 crc kubenswrapper[4912]: I0318 13:48:05.230297 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-mq8nq" event={"ID":"038d624a-12c8-4de7-84b3-1d84c0e7ffe3","Type":"ContainerDied","Data":"3b5c3b7b50c064315472e2bf77e339b7d1cc4010a2665b5cc5e7780ed3e109d6"} Mar 18 13:48:05 crc kubenswrapper[4912]: I0318 13:48:05.230527 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b5c3b7b50c064315472e2bf77e339b7d1cc4010a2665b5cc5e7780ed3e109d6" Mar 18 13:48:05 crc kubenswrapper[4912]: I0318 13:48:05.743428 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tvwgv"] Mar 18 13:48:05 crc kubenswrapper[4912]: I0318 13:48:05.754912 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tvwgv"] Mar 18 13:48:06 crc kubenswrapper[4912]: I0318 13:48:06.242139 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d0116e-2b66-4126-b2bb-9f7920bfcbd3" path="/var/lib/kubelet/pods/09d0116e-2b66-4126-b2bb-9f7920bfcbd3/volumes" Mar 18 13:48:09 crc kubenswrapper[4912]: I0318 13:48:09.273515 4912 generic.go:334] "Generic (PLEG): container finished" podID="04124f84-2385-4e87-b1c6-ac325ca92d7a" containerID="c9986267604994e043651836892a38fb604d07033d5b8b0fd63a68678a2f5b98" exitCode=0 Mar 18 13:48:09 crc kubenswrapper[4912]: I0318 13:48:09.273617 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" event={"ID":"04124f84-2385-4e87-b1c6-ac325ca92d7a","Type":"ContainerDied","Data":"c9986267604994e043651836892a38fb604d07033d5b8b0fd63a68678a2f5b98"} Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.840903 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.949511 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-ssh-key-openstack-edpm-ipam\") pod \"04124f84-2385-4e87-b1c6-ac325ca92d7a\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.949571 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tfjn\" (UniqueName: \"kubernetes.io/projected/04124f84-2385-4e87-b1c6-ac325ca92d7a-kube-api-access-9tfjn\") pod \"04124f84-2385-4e87-b1c6-ac325ca92d7a\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.949648 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-secret-0\") pod \"04124f84-2385-4e87-b1c6-ac325ca92d7a\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.949833 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-inventory\") pod \"04124f84-2385-4e87-b1c6-ac325ca92d7a\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.949932 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-combined-ca-bundle\") pod \"04124f84-2385-4e87-b1c6-ac325ca92d7a\" (UID: \"04124f84-2385-4e87-b1c6-ac325ca92d7a\") " Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.957292 4912 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "04124f84-2385-4e87-b1c6-ac325ca92d7a" (UID: "04124f84-2385-4e87-b1c6-ac325ca92d7a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.957437 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04124f84-2385-4e87-b1c6-ac325ca92d7a-kube-api-access-9tfjn" (OuterVolumeSpecName: "kube-api-access-9tfjn") pod "04124f84-2385-4e87-b1c6-ac325ca92d7a" (UID: "04124f84-2385-4e87-b1c6-ac325ca92d7a"). InnerVolumeSpecName "kube-api-access-9tfjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.986261 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "04124f84-2385-4e87-b1c6-ac325ca92d7a" (UID: "04124f84-2385-4e87-b1c6-ac325ca92d7a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.991588 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04124f84-2385-4e87-b1c6-ac325ca92d7a" (UID: "04124f84-2385-4e87-b1c6-ac325ca92d7a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:48:10 crc kubenswrapper[4912]: I0318 13:48:10.994163 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-inventory" (OuterVolumeSpecName: "inventory") pod "04124f84-2385-4e87-b1c6-ac325ca92d7a" (UID: "04124f84-2385-4e87-b1c6-ac325ca92d7a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.052604 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.052652 4912 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.052666 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.052678 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tfjn\" (UniqueName: \"kubernetes.io/projected/04124f84-2385-4e87-b1c6-ac325ca92d7a-kube-api-access-9tfjn\") on node \"crc\" DevicePath \"\"" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.052695 4912 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04124f84-2385-4e87-b1c6-ac325ca92d7a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.298591 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" event={"ID":"04124f84-2385-4e87-b1c6-ac325ca92d7a","Type":"ContainerDied","Data":"a4c8ff332da61d727cf7e60f4e267d14101d0b9e2ef2c0b087c2e6fcc4e34ba0"} Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.298652 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4c8ff332da61d727cf7e60f4e267d14101d0b9e2ef2c0b087c2e6fcc4e34ba0" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.298696 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.481723 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k"] Mar 18 13:48:11 crc kubenswrapper[4912]: E0318 13:48:11.482429 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04124f84-2385-4e87-b1c6-ac325ca92d7a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.482463 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="04124f84-2385-4e87-b1c6-ac325ca92d7a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:48:11 crc kubenswrapper[4912]: E0318 13:48:11.482530 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038d624a-12c8-4de7-84b3-1d84c0e7ffe3" containerName="oc" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.482541 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d624a-12c8-4de7-84b3-1d84c0e7ffe3" containerName="oc" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.482855 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="04124f84-2385-4e87-b1c6-ac325ca92d7a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.482901 4912 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="038d624a-12c8-4de7-84b3-1d84c0e7ffe3" containerName="oc" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.484004 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.487669 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.489081 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.490858 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.491120 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.491308 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.491385 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.491398 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.510008 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k"] Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567711 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567791 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567817 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567841 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567868 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5286\" (UniqueName: \"kubernetes.io/projected/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-kube-api-access-v5286\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567902 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567926 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.567970 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.568026 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.568068 4912 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.568104 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.670317 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.670651 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.670814 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5286\" (UniqueName: \"kubernetes.io/projected/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-kube-api-access-v5286\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.671434 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.672032 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.672352 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.672537 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.672884 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.673095 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.675631 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.676921 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.677209 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.677212 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.677501 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.677922 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.678103 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.678996 4912 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.679657 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.680648 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.680738 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.689805 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: 
\"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.694988 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5286\" (UniqueName: \"kubernetes.io/projected/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-kube-api-access-v5286\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7rh7k\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:11 crc kubenswrapper[4912]: I0318 13:48:11.805866 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:48:12 crc kubenswrapper[4912]: I0318 13:48:12.424873 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k"] Mar 18 13:48:13 crc kubenswrapper[4912]: I0318 13:48:13.330515 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" event={"ID":"3db96e35-5cad-42d1-afe8-bf48fa9ac92e","Type":"ContainerStarted","Data":"24fb9e286bc07b315b21d3bc12387dc0642c5cd319822853be74763612338876"} Mar 18 13:48:13 crc kubenswrapper[4912]: I0318 13:48:13.331072 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" event={"ID":"3db96e35-5cad-42d1-afe8-bf48fa9ac92e","Type":"ContainerStarted","Data":"7abeaaf1e860c899deded4acfd5de232e38ed4012ec489cfc3267181240d12be"} Mar 18 13:48:14 crc kubenswrapper[4912]: I0318 13:48:14.227976 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:48:14 crc kubenswrapper[4912]: E0318 13:48:14.228813 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:48:29 crc kubenswrapper[4912]: I0318 13:48:29.228218 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:48:29 crc kubenswrapper[4912]: E0318 13:48:29.229681 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:48:42 crc kubenswrapper[4912]: I0318 13:48:42.237071 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:48:42 crc kubenswrapper[4912]: E0318 13:48:42.238327 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:48:53 crc kubenswrapper[4912]: I0318 13:48:53.239304 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:48:53 crc kubenswrapper[4912]: E0318 13:48:53.240964 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:49:03 crc kubenswrapper[4912]: I0318 13:49:03.782065 4912 scope.go:117] "RemoveContainer" containerID="9a569fdaec78e3c77047c4fdd079d7d4f968ec80f9f59210b33b30d531b5f9e3" Mar 18 13:49:05 crc kubenswrapper[4912]: I0318 13:49:05.228512 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:49:05 crc kubenswrapper[4912]: E0318 13:49:05.228937 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:49:18 crc kubenswrapper[4912]: I0318 13:49:18.228991 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:49:19 crc kubenswrapper[4912]: I0318 13:49:19.148585 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"4c10d1c28ff97f0cc0a16b7461f0b81a04d3f6128153ac9f55e2d7ef187c416f"} Mar 18 13:49:19 crc kubenswrapper[4912]: I0318 13:49:19.187983 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" podStartSLOduration=67.639093647 podStartE2EDuration="1m8.187949593s" podCreationTimestamp="2026-03-18 13:48:11 +0000 UTC" firstStartedPulling="2026-03-18 
13:48:12.437107382 +0000 UTC m=+2740.896534807" lastFinishedPulling="2026-03-18 13:48:12.985963338 +0000 UTC m=+2741.445390753" observedRunningTime="2026-03-18 13:48:13.355550697 +0000 UTC m=+2741.814978122" watchObservedRunningTime="2026-03-18 13:49:19.187949593 +0000 UTC m=+2807.647377018" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.555359 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f8d8f"] Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.560424 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.567969 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8d8f"] Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.760215 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-catalog-content\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.760267 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4bm\" (UniqueName: \"kubernetes.io/projected/f5041c12-6502-4d75-87dc-9c6367620d59-kube-api-access-bl4bm\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.760404 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-utilities\") pod \"community-operators-f8d8f\" (UID: 
\"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.862887 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-utilities\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.863076 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-catalog-content\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.863118 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4bm\" (UniqueName: \"kubernetes.io/projected/f5041c12-6502-4d75-87dc-9c6367620d59-kube-api-access-bl4bm\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.863449 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-utilities\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.863600 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-catalog-content\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") 
" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.892860 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4bm\" (UniqueName: \"kubernetes.io/projected/f5041c12-6502-4d75-87dc-9c6367620d59-kube-api-access-bl4bm\") pod \"community-operators-f8d8f\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:45 crc kubenswrapper[4912]: I0318 13:49:45.898204 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:46 crc kubenswrapper[4912]: I0318 13:49:46.559632 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8d8f"] Mar 18 13:49:46 crc kubenswrapper[4912]: I0318 13:49:46.712870 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d8f" event={"ID":"f5041c12-6502-4d75-87dc-9c6367620d59","Type":"ContainerStarted","Data":"f8e132a3ca35a3df6f655443a163076af39123365a65bb649c599b1a0ad4cb48"} Mar 18 13:49:47 crc kubenswrapper[4912]: I0318 13:49:47.770011 4912 generic.go:334] "Generic (PLEG): container finished" podID="f5041c12-6502-4d75-87dc-9c6367620d59" containerID="aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579" exitCode=0 Mar 18 13:49:47 crc kubenswrapper[4912]: I0318 13:49:47.770167 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d8f" event={"ID":"f5041c12-6502-4d75-87dc-9c6367620d59","Type":"ContainerDied","Data":"aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579"} Mar 18 13:49:47 crc kubenswrapper[4912]: I0318 13:49:47.773879 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:49:49 crc kubenswrapper[4912]: I0318 13:49:49.812346 4912 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-f8d8f" event={"ID":"f5041c12-6502-4d75-87dc-9c6367620d59","Type":"ContainerStarted","Data":"99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e"} Mar 18 13:49:51 crc kubenswrapper[4912]: I0318 13:49:51.846176 4912 generic.go:334] "Generic (PLEG): container finished" podID="f5041c12-6502-4d75-87dc-9c6367620d59" containerID="99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e" exitCode=0 Mar 18 13:49:51 crc kubenswrapper[4912]: I0318 13:49:51.846281 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d8f" event={"ID":"f5041c12-6502-4d75-87dc-9c6367620d59","Type":"ContainerDied","Data":"99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e"} Mar 18 13:49:52 crc kubenswrapper[4912]: I0318 13:49:52.863928 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d8f" event={"ID":"f5041c12-6502-4d75-87dc-9c6367620d59","Type":"ContainerStarted","Data":"cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df"} Mar 18 13:49:52 crc kubenswrapper[4912]: I0318 13:49:52.895391 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f8d8f" podStartSLOduration=3.310804343 podStartE2EDuration="7.895363524s" podCreationTimestamp="2026-03-18 13:49:45 +0000 UTC" firstStartedPulling="2026-03-18 13:49:47.773648772 +0000 UTC m=+2836.233076197" lastFinishedPulling="2026-03-18 13:49:52.358207953 +0000 UTC m=+2840.817635378" observedRunningTime="2026-03-18 13:49:52.888732455 +0000 UTC m=+2841.348159890" watchObservedRunningTime="2026-03-18 13:49:52.895363524 +0000 UTC m=+2841.354790949" Mar 18 13:49:55 crc kubenswrapper[4912]: I0318 13:49:55.899173 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:55 crc kubenswrapper[4912]: I0318 13:49:55.901100 
4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:55 crc kubenswrapper[4912]: I0318 13:49:55.960702 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.044193 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mtzsh"] Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.048412 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.062596 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mtzsh"] Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.167070 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4l65\" (UniqueName: \"kubernetes.io/projected/f93abf87-9793-41b0-bd8d-d2d38ae319b0-kube-api-access-r4l65\") pod \"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.167510 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-catalog-content\") pod \"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.167813 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-utilities\") pod 
\"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.270860 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4l65\" (UniqueName: \"kubernetes.io/projected/f93abf87-9793-41b0-bd8d-d2d38ae319b0-kube-api-access-r4l65\") pod \"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.271083 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-catalog-content\") pod \"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.271146 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-utilities\") pod \"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.271946 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-utilities\") pod \"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.272169 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-catalog-content\") pod \"certified-operators-mtzsh\" (UID: 
\"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.304128 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4l65\" (UniqueName: \"kubernetes.io/projected/f93abf87-9793-41b0-bd8d-d2d38ae319b0-kube-api-access-r4l65\") pod \"certified-operators-mtzsh\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.385098 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:49:58 crc kubenswrapper[4912]: I0318 13:49:58.997574 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mtzsh"] Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.419162 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qgpjg"] Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.426769 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.439086 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qgpjg"] Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.515249 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hjrk\" (UniqueName: \"kubernetes.io/projected/8aafa39c-882e-4992-a129-e77067a4862a-kube-api-access-6hjrk\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.515821 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-catalog-content\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.516000 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-utilities\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.619328 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hjrk\" (UniqueName: \"kubernetes.io/projected/8aafa39c-882e-4992-a129-e77067a4862a-kube-api-access-6hjrk\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.619500 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-catalog-content\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.619680 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-utilities\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.620332 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-catalog-content\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.620362 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-utilities\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.651236 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hjrk\" (UniqueName: \"kubernetes.io/projected/8aafa39c-882e-4992-a129-e77067a4862a-kube-api-access-6hjrk\") pod \"redhat-operators-qgpjg\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.761173 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.971132 4912 generic.go:334] "Generic (PLEG): container finished" podID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerID="be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc" exitCode=0 Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.971327 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtzsh" event={"ID":"f93abf87-9793-41b0-bd8d-d2d38ae319b0","Type":"ContainerDied","Data":"be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc"} Mar 18 13:49:59 crc kubenswrapper[4912]: I0318 13:49:59.971473 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtzsh" event={"ID":"f93abf87-9793-41b0-bd8d-d2d38ae319b0","Type":"ContainerStarted","Data":"d32f2a4fcab9b8a50d8c847b113adeae4e676b5ffb53fcab0adbd8e6277c1471"} Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.165637 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564030-bxv8t"] Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.167838 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-bxv8t" Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.170677 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.171081 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.171245 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.194679 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-bxv8t"] Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.249900 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzmsb\" (UniqueName: \"kubernetes.io/projected/934d7b04-df5d-4a73-a650-45b970a7a96e-kube-api-access-mzmsb\") pod \"auto-csr-approver-29564030-bxv8t\" (UID: \"934d7b04-df5d-4a73-a650-45b970a7a96e\") " pod="openshift-infra/auto-csr-approver-29564030-bxv8t" Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.353090 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzmsb\" (UniqueName: \"kubernetes.io/projected/934d7b04-df5d-4a73-a650-45b970a7a96e-kube-api-access-mzmsb\") pod \"auto-csr-approver-29564030-bxv8t\" (UID: \"934d7b04-df5d-4a73-a650-45b970a7a96e\") " pod="openshift-infra/auto-csr-approver-29564030-bxv8t" Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.367706 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qgpjg"] Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.389476 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzmsb\" (UniqueName: 
\"kubernetes.io/projected/934d7b04-df5d-4a73-a650-45b970a7a96e-kube-api-access-mzmsb\") pod \"auto-csr-approver-29564030-bxv8t\" (UID: \"934d7b04-df5d-4a73-a650-45b970a7a96e\") " pod="openshift-infra/auto-csr-approver-29564030-bxv8t" Mar 18 13:50:00 crc kubenswrapper[4912]: I0318 13:50:00.504552 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-bxv8t" Mar 18 13:50:01 crc kubenswrapper[4912]: I0318 13:50:01.039071 4912 generic.go:334] "Generic (PLEG): container finished" podID="8aafa39c-882e-4992-a129-e77067a4862a" containerID="c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207" exitCode=0 Mar 18 13:50:01 crc kubenswrapper[4912]: I0318 13:50:01.039518 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgpjg" event={"ID":"8aafa39c-882e-4992-a129-e77067a4862a","Type":"ContainerDied","Data":"c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207"} Mar 18 13:50:01 crc kubenswrapper[4912]: I0318 13:50:01.040292 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgpjg" event={"ID":"8aafa39c-882e-4992-a129-e77067a4862a","Type":"ContainerStarted","Data":"c24d7bd27bf81492cb283fd8057b65f607345252e83f62f180d94e9e81191c20"} Mar 18 13:50:01 crc kubenswrapper[4912]: W0318 13:50:01.096818 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod934d7b04_df5d_4a73_a650_45b970a7a96e.slice/crio-8ef3b0835254f4188d178a219540ee08d93aea5ba4df4d7440cb03be5c620ca8 WatchSource:0}: Error finding container 8ef3b0835254f4188d178a219540ee08d93aea5ba4df4d7440cb03be5c620ca8: Status 404 returned error can't find the container with id 8ef3b0835254f4188d178a219540ee08d93aea5ba4df4d7440cb03be5c620ca8 Mar 18 13:50:01 crc kubenswrapper[4912]: I0318 13:50:01.099106 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29564030-bxv8t"] Mar 18 13:50:02 crc kubenswrapper[4912]: I0318 13:50:02.067990 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtzsh" event={"ID":"f93abf87-9793-41b0-bd8d-d2d38ae319b0","Type":"ContainerStarted","Data":"b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f"} Mar 18 13:50:02 crc kubenswrapper[4912]: I0318 13:50:02.073130 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-bxv8t" event={"ID":"934d7b04-df5d-4a73-a650-45b970a7a96e","Type":"ContainerStarted","Data":"8ef3b0835254f4188d178a219540ee08d93aea5ba4df4d7440cb03be5c620ca8"} Mar 18 13:50:04 crc kubenswrapper[4912]: I0318 13:50:04.116677 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgpjg" event={"ID":"8aafa39c-882e-4992-a129-e77067a4862a","Type":"ContainerStarted","Data":"2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8"} Mar 18 13:50:04 crc kubenswrapper[4912]: I0318 13:50:04.119725 4912 generic.go:334] "Generic (PLEG): container finished" podID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerID="b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f" exitCode=0 Mar 18 13:50:04 crc kubenswrapper[4912]: I0318 13:50:04.119814 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtzsh" event={"ID":"f93abf87-9793-41b0-bd8d-d2d38ae319b0","Type":"ContainerDied","Data":"b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f"} Mar 18 13:50:04 crc kubenswrapper[4912]: I0318 13:50:04.122619 4912 generic.go:334] "Generic (PLEG): container finished" podID="934d7b04-df5d-4a73-a650-45b970a7a96e" containerID="b030573ec8e412bc973fc21737768a288f14e4513be75738a7d3680c16fefd7a" exitCode=0 Mar 18 13:50:04 crc kubenswrapper[4912]: I0318 13:50:04.122680 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564030-bxv8t" event={"ID":"934d7b04-df5d-4a73-a650-45b970a7a96e","Type":"ContainerDied","Data":"b030573ec8e412bc973fc21737768a288f14e4513be75738a7d3680c16fefd7a"} Mar 18 13:50:05 crc kubenswrapper[4912]: I0318 13:50:05.580475 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-bxv8t" Mar 18 13:50:05 crc kubenswrapper[4912]: I0318 13:50:05.660721 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzmsb\" (UniqueName: \"kubernetes.io/projected/934d7b04-df5d-4a73-a650-45b970a7a96e-kube-api-access-mzmsb\") pod \"934d7b04-df5d-4a73-a650-45b970a7a96e\" (UID: \"934d7b04-df5d-4a73-a650-45b970a7a96e\") " Mar 18 13:50:05 crc kubenswrapper[4912]: I0318 13:50:05.669873 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934d7b04-df5d-4a73-a650-45b970a7a96e-kube-api-access-mzmsb" (OuterVolumeSpecName: "kube-api-access-mzmsb") pod "934d7b04-df5d-4a73-a650-45b970a7a96e" (UID: "934d7b04-df5d-4a73-a650-45b970a7a96e"). InnerVolumeSpecName "kube-api-access-mzmsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:50:05 crc kubenswrapper[4912]: I0318 13:50:05.765613 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzmsb\" (UniqueName: \"kubernetes.io/projected/934d7b04-df5d-4a73-a650-45b970a7a96e-kube-api-access-mzmsb\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:05 crc kubenswrapper[4912]: I0318 13:50:05.960582 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:50:06 crc kubenswrapper[4912]: I0318 13:50:06.149210 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-bxv8t" event={"ID":"934d7b04-df5d-4a73-a650-45b970a7a96e","Type":"ContainerDied","Data":"8ef3b0835254f4188d178a219540ee08d93aea5ba4df4d7440cb03be5c620ca8"} Mar 18 13:50:06 crc kubenswrapper[4912]: I0318 13:50:06.149268 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ef3b0835254f4188d178a219540ee08d93aea5ba4df4d7440cb03be5c620ca8" Mar 18 13:50:06 crc kubenswrapper[4912]: I0318 13:50:06.149222 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-bxv8t" Mar 18 13:50:06 crc kubenswrapper[4912]: I0318 13:50:06.151729 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtzsh" event={"ID":"f93abf87-9793-41b0-bd8d-d2d38ae319b0","Type":"ContainerStarted","Data":"3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60"} Mar 18 13:50:06 crc kubenswrapper[4912]: I0318 13:50:06.182861 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mtzsh" podStartSLOduration=4.284514338 podStartE2EDuration="9.182831265s" podCreationTimestamp="2026-03-18 13:49:57 +0000 UTC" firstStartedPulling="2026-03-18 13:49:59.976982445 +0000 UTC m=+2848.436409860" lastFinishedPulling="2026-03-18 13:50:04.875299362 +0000 UTC m=+2853.334726787" observedRunningTime="2026-03-18 13:50:06.172243649 +0000 UTC m=+2854.631671084" watchObservedRunningTime="2026-03-18 13:50:06.182831265 +0000 UTC m=+2854.642258690" Mar 18 13:50:06 crc kubenswrapper[4912]: I0318 13:50:06.679542 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-vf9tx"] Mar 18 13:50:06 crc kubenswrapper[4912]: I0318 13:50:06.693633 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-vf9tx"] Mar 18 13:50:08 crc kubenswrapper[4912]: I0318 13:50:08.245637 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88cc1d26-0b9e-4b34-9280-848826e923c0" path="/var/lib/kubelet/pods/88cc1d26-0b9e-4b34-9280-848826e923c0/volumes" Mar 18 13:50:08 crc kubenswrapper[4912]: I0318 13:50:08.386765 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:50:08 crc kubenswrapper[4912]: I0318 13:50:08.386928 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:50:08 crc kubenswrapper[4912]: I0318 13:50:08.410115 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8d8f"] Mar 18 13:50:08 crc kubenswrapper[4912]: I0318 13:50:08.410466 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f8d8f" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="registry-server" containerID="cri-o://cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df" gracePeriod=2 Mar 18 13:50:08 crc kubenswrapper[4912]: I0318 13:50:08.950000 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.078218 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl4bm\" (UniqueName: \"kubernetes.io/projected/f5041c12-6502-4d75-87dc-9c6367620d59-kube-api-access-bl4bm\") pod \"f5041c12-6502-4d75-87dc-9c6367620d59\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.078424 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-catalog-content\") pod \"f5041c12-6502-4d75-87dc-9c6367620d59\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.078638 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-utilities\") pod \"f5041c12-6502-4d75-87dc-9c6367620d59\" (UID: \"f5041c12-6502-4d75-87dc-9c6367620d59\") " Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.079490 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-utilities" (OuterVolumeSpecName: "utilities") pod "f5041c12-6502-4d75-87dc-9c6367620d59" (UID: "f5041c12-6502-4d75-87dc-9c6367620d59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.086182 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5041c12-6502-4d75-87dc-9c6367620d59-kube-api-access-bl4bm" (OuterVolumeSpecName: "kube-api-access-bl4bm") pod "f5041c12-6502-4d75-87dc-9c6367620d59" (UID: "f5041c12-6502-4d75-87dc-9c6367620d59"). InnerVolumeSpecName "kube-api-access-bl4bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.134389 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5041c12-6502-4d75-87dc-9c6367620d59" (UID: "f5041c12-6502-4d75-87dc-9c6367620d59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.183575 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.183638 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl4bm\" (UniqueName: \"kubernetes.io/projected/f5041c12-6502-4d75-87dc-9c6367620d59-kube-api-access-bl4bm\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.183653 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5041c12-6502-4d75-87dc-9c6367620d59-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.193210 4912 generic.go:334] "Generic (PLEG): container finished" podID="8aafa39c-882e-4992-a129-e77067a4862a" containerID="2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8" exitCode=0 Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.193283 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgpjg" event={"ID":"8aafa39c-882e-4992-a129-e77067a4862a","Type":"ContainerDied","Data":"2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8"} Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.196373 4912 generic.go:334] "Generic (PLEG): container finished" podID="f5041c12-6502-4d75-87dc-9c6367620d59" containerID="cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df" exitCode=0 Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.196424 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8d8f" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.196478 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d8f" event={"ID":"f5041c12-6502-4d75-87dc-9c6367620d59","Type":"ContainerDied","Data":"cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df"} Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.196511 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d8f" event={"ID":"f5041c12-6502-4d75-87dc-9c6367620d59","Type":"ContainerDied","Data":"f8e132a3ca35a3df6f655443a163076af39123365a65bb649c599b1a0ad4cb48"} Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.196561 4912 scope.go:117] "RemoveContainer" containerID="cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.249880 4912 scope.go:117] "RemoveContainer" containerID="99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.261763 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8d8f"] Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.274143 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f8d8f"] Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.285925 4912 scope.go:117] "RemoveContainer" containerID="aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.345623 4912 scope.go:117] "RemoveContainer" containerID="cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df" Mar 18 13:50:09 crc kubenswrapper[4912]: E0318 13:50:09.346174 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df\": container with ID starting with cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df not found: ID does not exist" containerID="cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.346235 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df"} err="failed to get container status \"cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df\": rpc error: code = NotFound desc = could not find container \"cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df\": container with ID starting with cf0d2edc4caa341a0a99df5aafd26e47c12318962a9314aec710b751b12622df not found: ID does not exist" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.346267 4912 scope.go:117] "RemoveContainer" containerID="99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e" Mar 18 13:50:09 crc kubenswrapper[4912]: E0318 13:50:09.346630 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e\": container with ID starting with 99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e not found: ID does not exist" containerID="99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.346660 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e"} err="failed to get container status \"99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e\": rpc error: code = NotFound desc = could not find container \"99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e\": container with ID 
starting with 99f4d320c95a8c6c4520d11f3a703b878262e62a886b53c1b83ba829f2cfc65e not found: ID does not exist" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.346676 4912 scope.go:117] "RemoveContainer" containerID="aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579" Mar 18 13:50:09 crc kubenswrapper[4912]: E0318 13:50:09.347111 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579\": container with ID starting with aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579 not found: ID does not exist" containerID="aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.347136 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579"} err="failed to get container status \"aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579\": rpc error: code = NotFound desc = could not find container \"aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579\": container with ID starting with aedec8c22bd5a2b6bab4f63f0d8d7c5613937ff21b9e8b41a7ea5fe559d81579 not found: ID does not exist" Mar 18 13:50:09 crc kubenswrapper[4912]: I0318 13:50:09.440860 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mtzsh" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="registry-server" probeResult="failure" output=< Mar 18 13:50:09 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:50:09 crc kubenswrapper[4912]: > Mar 18 13:50:10 crc kubenswrapper[4912]: I0318 13:50:10.216091 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgpjg" 
event={"ID":"8aafa39c-882e-4992-a129-e77067a4862a","Type":"ContainerStarted","Data":"85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8"} Mar 18 13:50:10 crc kubenswrapper[4912]: I0318 13:50:10.246538 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qgpjg" podStartSLOduration=2.649673697 podStartE2EDuration="11.246508886s" podCreationTimestamp="2026-03-18 13:49:59 +0000 UTC" firstStartedPulling="2026-03-18 13:50:01.041386518 +0000 UTC m=+2849.500813943" lastFinishedPulling="2026-03-18 13:50:09.638221707 +0000 UTC m=+2858.097649132" observedRunningTime="2026-03-18 13:50:10.237524794 +0000 UTC m=+2858.696952219" watchObservedRunningTime="2026-03-18 13:50:10.246508886 +0000 UTC m=+2858.705936331" Mar 18 13:50:10 crc kubenswrapper[4912]: I0318 13:50:10.260068 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" path="/var/lib/kubelet/pods/f5041c12-6502-4d75-87dc-9c6367620d59/volumes" Mar 18 13:50:18 crc kubenswrapper[4912]: I0318 13:50:18.439984 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:50:18 crc kubenswrapper[4912]: I0318 13:50:18.497726 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:50:18 crc kubenswrapper[4912]: I0318 13:50:18.685809 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mtzsh"] Mar 18 13:50:19 crc kubenswrapper[4912]: I0318 13:50:19.761667 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:50:19 crc kubenswrapper[4912]: I0318 13:50:19.761730 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:50:20 crc 
kubenswrapper[4912]: I0318 13:50:20.367560 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mtzsh" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="registry-server" containerID="cri-o://3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60" gracePeriod=2 Mar 18 13:50:20 crc kubenswrapper[4912]: I0318 13:50:20.816639 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qgpjg" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="registry-server" probeResult="failure" output=< Mar 18 13:50:20 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:50:20 crc kubenswrapper[4912]: > Mar 18 13:50:20 crc kubenswrapper[4912]: I0318 13:50:20.985963 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.149842 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4l65\" (UniqueName: \"kubernetes.io/projected/f93abf87-9793-41b0-bd8d-d2d38ae319b0-kube-api-access-r4l65\") pod \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.149909 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-catalog-content\") pod \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\" (UID: \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.150487 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-utilities\") pod \"f93abf87-9793-41b0-bd8d-d2d38ae319b0\" (UID: 
\"f93abf87-9793-41b0-bd8d-d2d38ae319b0\") " Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.150960 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-utilities" (OuterVolumeSpecName: "utilities") pod "f93abf87-9793-41b0-bd8d-d2d38ae319b0" (UID: "f93abf87-9793-41b0-bd8d-d2d38ae319b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.151931 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.157410 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93abf87-9793-41b0-bd8d-d2d38ae319b0-kube-api-access-r4l65" (OuterVolumeSpecName: "kube-api-access-r4l65") pod "f93abf87-9793-41b0-bd8d-d2d38ae319b0" (UID: "f93abf87-9793-41b0-bd8d-d2d38ae319b0"). InnerVolumeSpecName "kube-api-access-r4l65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.215759 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f93abf87-9793-41b0-bd8d-d2d38ae319b0" (UID: "f93abf87-9793-41b0-bd8d-d2d38ae319b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.254819 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93abf87-9793-41b0-bd8d-d2d38ae319b0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.254856 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4l65\" (UniqueName: \"kubernetes.io/projected/f93abf87-9793-41b0-bd8d-d2d38ae319b0-kube-api-access-r4l65\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.381866 4912 generic.go:334] "Generic (PLEG): container finished" podID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerID="3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60" exitCode=0 Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.381934 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtzsh" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.381935 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtzsh" event={"ID":"f93abf87-9793-41b0-bd8d-d2d38ae319b0","Type":"ContainerDied","Data":"3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60"} Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.382118 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtzsh" event={"ID":"f93abf87-9793-41b0-bd8d-d2d38ae319b0","Type":"ContainerDied","Data":"d32f2a4fcab9b8a50d8c847b113adeae4e676b5ffb53fcab0adbd8e6277c1471"} Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.382163 4912 scope.go:117] "RemoveContainer" containerID="3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.420601 4912 scope.go:117] "RemoveContainer" 
containerID="b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.421932 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mtzsh"] Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.433306 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mtzsh"] Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.449551 4912 scope.go:117] "RemoveContainer" containerID="be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.508555 4912 scope.go:117] "RemoveContainer" containerID="3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60" Mar 18 13:50:21 crc kubenswrapper[4912]: E0318 13:50:21.509290 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60\": container with ID starting with 3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60 not found: ID does not exist" containerID="3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.509430 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60"} err="failed to get container status \"3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60\": rpc error: code = NotFound desc = could not find container \"3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60\": container with ID starting with 3d80e64813b7eed3dc4dc04192075d9f2cab441862ae7fce1bba5a00d4405b60 not found: ID does not exist" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.509466 4912 scope.go:117] "RemoveContainer" 
containerID="b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f" Mar 18 13:50:21 crc kubenswrapper[4912]: E0318 13:50:21.509863 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f\": container with ID starting with b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f not found: ID does not exist" containerID="b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.509904 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f"} err="failed to get container status \"b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f\": rpc error: code = NotFound desc = could not find container \"b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f\": container with ID starting with b34371940cedcf265b919b0b457310809a8e13e6b3787a07cee4ad85b0c7b56f not found: ID does not exist" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.509932 4912 scope.go:117] "RemoveContainer" containerID="be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc" Mar 18 13:50:21 crc kubenswrapper[4912]: E0318 13:50:21.510452 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc\": container with ID starting with be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc not found: ID does not exist" containerID="be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc" Mar 18 13:50:21 crc kubenswrapper[4912]: I0318 13:50:21.510482 4912 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc"} err="failed to get container status \"be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc\": rpc error: code = NotFound desc = could not find container \"be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc\": container with ID starting with be63de5a0ac4ea0f7196cf18df63ec9ea3c55a3d3ff47f04d97d08f80e7f82cc not found: ID does not exist" Mar 18 13:50:22 crc kubenswrapper[4912]: I0318 13:50:22.245817 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" path="/var/lib/kubelet/pods/f93abf87-9793-41b0-bd8d-d2d38ae319b0/volumes" Mar 18 13:50:30 crc kubenswrapper[4912]: I0318 13:50:30.818617 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qgpjg" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="registry-server" probeResult="failure" output=< Mar 18 13:50:30 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 13:50:30 crc kubenswrapper[4912]: > Mar 18 13:50:39 crc kubenswrapper[4912]: I0318 13:50:39.828169 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:50:39 crc kubenswrapper[4912]: I0318 13:50:39.887742 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:50:40 crc kubenswrapper[4912]: I0318 13:50:40.076499 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qgpjg"] Mar 18 13:50:41 crc kubenswrapper[4912]: I0318 13:50:41.642379 4912 generic.go:334] "Generic (PLEG): container finished" podID="3db96e35-5cad-42d1-afe8-bf48fa9ac92e" containerID="24fb9e286bc07b315b21d3bc12387dc0642c5cd319822853be74763612338876" exitCode=0 Mar 18 13:50:41 crc kubenswrapper[4912]: I0318 13:50:41.642469 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" event={"ID":"3db96e35-5cad-42d1-afe8-bf48fa9ac92e","Type":"ContainerDied","Data":"24fb9e286bc07b315b21d3bc12387dc0642c5cd319822853be74763612338876"} Mar 18 13:50:41 crc kubenswrapper[4912]: I0318 13:50:41.643432 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qgpjg" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="registry-server" containerID="cri-o://85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8" gracePeriod=2 Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.239165 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.292586 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-catalog-content\") pod \"8aafa39c-882e-4992-a129-e77067a4862a\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.292833 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-utilities\") pod \"8aafa39c-882e-4992-a129-e77067a4862a\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.292918 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hjrk\" (UniqueName: \"kubernetes.io/projected/8aafa39c-882e-4992-a129-e77067a4862a-kube-api-access-6hjrk\") pod \"8aafa39c-882e-4992-a129-e77067a4862a\" (UID: \"8aafa39c-882e-4992-a129-e77067a4862a\") " Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.293890 4912 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-utilities" (OuterVolumeSpecName: "utilities") pod "8aafa39c-882e-4992-a129-e77067a4862a" (UID: "8aafa39c-882e-4992-a129-e77067a4862a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.300321 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aafa39c-882e-4992-a129-e77067a4862a-kube-api-access-6hjrk" (OuterVolumeSpecName: "kube-api-access-6hjrk") pod "8aafa39c-882e-4992-a129-e77067a4862a" (UID: "8aafa39c-882e-4992-a129-e77067a4862a"). InnerVolumeSpecName "kube-api-access-6hjrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.394985 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.395034 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hjrk\" (UniqueName: \"kubernetes.io/projected/8aafa39c-882e-4992-a129-e77067a4862a-kube-api-access-6hjrk\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.456801 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8aafa39c-882e-4992-a129-e77067a4862a" (UID: "8aafa39c-882e-4992-a129-e77067a4862a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.497456 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafa39c-882e-4992-a129-e77067a4862a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.659580 4912 generic.go:334] "Generic (PLEG): container finished" podID="8aafa39c-882e-4992-a129-e77067a4862a" containerID="85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8" exitCode=0 Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.659703 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgpjg" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.659703 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgpjg" event={"ID":"8aafa39c-882e-4992-a129-e77067a4862a","Type":"ContainerDied","Data":"85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8"} Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.659775 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgpjg" event={"ID":"8aafa39c-882e-4992-a129-e77067a4862a","Type":"ContainerDied","Data":"c24d7bd27bf81492cb283fd8057b65f607345252e83f62f180d94e9e81191c20"} Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.659805 4912 scope.go:117] "RemoveContainer" containerID="85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.724013 4912 scope.go:117] "RemoveContainer" containerID="2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.724899 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qgpjg"] Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 
13:50:42.743735 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qgpjg"] Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.766385 4912 scope.go:117] "RemoveContainer" containerID="c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.840359 4912 scope.go:117] "RemoveContainer" containerID="85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8" Mar 18 13:50:42 crc kubenswrapper[4912]: E0318 13:50:42.840803 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8\": container with ID starting with 85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8 not found: ID does not exist" containerID="85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.840845 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8"} err="failed to get container status \"85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8\": rpc error: code = NotFound desc = could not find container \"85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8\": container with ID starting with 85e8398c30d1ac76904ad277778f7ab9d42cad94ef3a6c087ea7e9c202cb9da8 not found: ID does not exist" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.840878 4912 scope.go:117] "RemoveContainer" containerID="2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8" Mar 18 13:50:42 crc kubenswrapper[4912]: E0318 13:50:42.841332 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8\": container with ID 
starting with 2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8 not found: ID does not exist" containerID="2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.841395 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8"} err="failed to get container status \"2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8\": rpc error: code = NotFound desc = could not find container \"2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8\": container with ID starting with 2f0210c774d5b0108c32460fe22642d31f2be9396936283aa0f1b0ede3b114f8 not found: ID does not exist" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.841439 4912 scope.go:117] "RemoveContainer" containerID="c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207" Mar 18 13:50:42 crc kubenswrapper[4912]: E0318 13:50:42.842583 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207\": container with ID starting with c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207 not found: ID does not exist" containerID="c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207" Mar 18 13:50:42 crc kubenswrapper[4912]: I0318 13:50:42.842632 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207"} err="failed to get container status \"c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207\": rpc error: code = NotFound desc = could not find container \"c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207\": container with ID starting with c698021ae9ba554f5494e37ebda1f0a34894dba77aee63e5a83f5be59c7d3207 not found: 
ID does not exist" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.305686 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.426434 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-1\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.426526 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-ssh-key-openstack-edpm-ipam\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.426592 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-0\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.426705 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-extra-config-0\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.426864 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-2\") 
pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.426908 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-3\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.426972 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-1\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.427025 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-0\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.427084 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5286\" (UniqueName: \"kubernetes.io/projected/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-kube-api-access-v5286\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.427208 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-inventory\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.427236 4912 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-combined-ca-bundle\") pod \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\" (UID: \"3db96e35-5cad-42d1-afe8-bf48fa9ac92e\") " Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.435380 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-kube-api-access-v5286" (OuterVolumeSpecName: "kube-api-access-v5286") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "kube-api-access-v5286". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.454778 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.470590 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.472903 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.473561 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.491633 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.495300 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.498584 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.504075 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-inventory" (OuterVolumeSpecName: "inventory") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.506405 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.515346 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "3db96e35-5cad-42d1-afe8-bf48fa9ac92e" (UID: "3db96e35-5cad-42d1-afe8-bf48fa9ac92e"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533618 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533671 4912 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533685 4912 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533698 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533711 4912 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533723 4912 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533731 4912 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-2\") on 
node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533740 4912 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533749 4912 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533759 4912 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.533771 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5286\" (UniqueName: \"kubernetes.io/projected/3db96e35-5cad-42d1-afe8-bf48fa9ac92e-kube-api-access-v5286\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.684600 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.684597 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7rh7k" event={"ID":"3db96e35-5cad-42d1-afe8-bf48fa9ac92e","Type":"ContainerDied","Data":"7abeaaf1e860c899deded4acfd5de232e38ed4012ec489cfc3267181240d12be"} Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.684700 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7abeaaf1e860c899deded4acfd5de232e38ed4012ec489cfc3267181240d12be" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.808811 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7"] Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809692 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809721 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809748 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809759 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809769 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db96e35-5cad-42d1-afe8-bf48fa9ac92e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809778 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3db96e35-5cad-42d1-afe8-bf48fa9ac92e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809802 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809811 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809838 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="extract-content" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809863 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="extract-content" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809902 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="extract-content" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809912 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="extract-content" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809925 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="extract-content" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809933 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="extract-content" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809951 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934d7b04-df5d-4a73-a650-45b970a7a96e" containerName="oc" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809959 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="934d7b04-df5d-4a73-a650-45b970a7a96e" containerName="oc" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809969 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="extract-utilities" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.809976 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="extract-utilities" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.809997 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="extract-utilities" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.810005 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="extract-utilities" Mar 18 13:50:43 crc kubenswrapper[4912]: E0318 13:50:43.810015 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="extract-utilities" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.810021 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="extract-utilities" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.810354 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db96e35-5cad-42d1-afe8-bf48fa9ac92e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.810375 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aafa39c-882e-4992-a129-e77067a4862a" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.810395 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="934d7b04-df5d-4a73-a650-45b970a7a96e" containerName="oc" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.810408 4912 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f93abf87-9793-41b0-bd8d-d2d38ae319b0" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.810428 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5041c12-6502-4d75-87dc-9c6367620d59" containerName="registry-server" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.811731 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.814633 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.815919 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.816033 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.817827 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.819469 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.822217 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7"] Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.948283 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: 
\"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.948390 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.948498 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.948543 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.948636 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56c7\" (UniqueName: \"kubernetes.io/projected/edc150d2-9448-4f82-a4a4-eeb5b0b06829-kube-api-access-k56c7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 
18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.948775 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:43 crc kubenswrapper[4912]: I0318 13:50:43.948820 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.051863 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.052481 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.052638 4912 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.052699 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.052819 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.052860 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.052944 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k56c7\" (UniqueName: \"kubernetes.io/projected/edc150d2-9448-4f82-a4a4-eeb5b0b06829-kube-api-access-k56c7\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.059282 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.059367 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.060171 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.060673 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: 
I0318 13:50:44.061235 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.062695 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.074446 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56c7\" (UniqueName: \"kubernetes.io/projected/edc150d2-9448-4f82-a4a4-eeb5b0b06829-kube-api-access-k56c7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.212521 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.243031 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aafa39c-882e-4992-a129-e77067a4862a" path="/var/lib/kubelet/pods/8aafa39c-882e-4992-a129-e77067a4862a/volumes" Mar 18 13:50:44 crc kubenswrapper[4912]: I0318 13:50:44.820332 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7"] Mar 18 13:50:45 crc kubenswrapper[4912]: I0318 13:50:45.715335 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" event={"ID":"edc150d2-9448-4f82-a4a4-eeb5b0b06829","Type":"ContainerStarted","Data":"7c08c2e5ed9faedc506f8a817ab983900defef61b79ead66e06743b8d2d58fdd"} Mar 18 13:50:45 crc kubenswrapper[4912]: I0318 13:50:45.715834 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" event={"ID":"edc150d2-9448-4f82-a4a4-eeb5b0b06829","Type":"ContainerStarted","Data":"f36dbb62aa06e21c066310842386f6fea87503a20ed9a6b538c3c7476fd609be"} Mar 18 13:50:45 crc kubenswrapper[4912]: I0318 13:50:45.737178 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" podStartSLOduration=2.207570326 podStartE2EDuration="2.737157322s" podCreationTimestamp="2026-03-18 13:50:43 +0000 UTC" firstStartedPulling="2026-03-18 13:50:44.821916762 +0000 UTC m=+2893.281344187" lastFinishedPulling="2026-03-18 13:50:45.351503768 +0000 UTC m=+2893.810931183" observedRunningTime="2026-03-18 13:50:45.73525577 +0000 UTC m=+2894.194683215" watchObservedRunningTime="2026-03-18 13:50:45.737157322 +0000 UTC m=+2894.196584747" Mar 18 13:51:03 crc kubenswrapper[4912]: I0318 13:51:03.929618 4912 scope.go:117] "RemoveContainer" 
containerID="83f5672d1b9f04d1fb404e096c1ba071bd79aa1597a17f73e2e4c2216a59af25" Mar 18 13:51:36 crc kubenswrapper[4912]: I0318 13:51:36.999026 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:51:37 crc kubenswrapper[4912]: I0318 13:51:37.002385 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.192638 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564032-h5q9n"] Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.198327 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.202399 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.202597 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.203097 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.261349 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-h5q9n"] Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.299931 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n42zn\" (UniqueName: \"kubernetes.io/projected/80671080-7422-4d4a-a0ce-2a6ed977a0d1-kube-api-access-n42zn\") pod \"auto-csr-approver-29564032-h5q9n\" (UID: \"80671080-7422-4d4a-a0ce-2a6ed977a0d1\") " pod="openshift-infra/auto-csr-approver-29564032-h5q9n" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.402746 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n42zn\" (UniqueName: \"kubernetes.io/projected/80671080-7422-4d4a-a0ce-2a6ed977a0d1-kube-api-access-n42zn\") pod \"auto-csr-approver-29564032-h5q9n\" (UID: \"80671080-7422-4d4a-a0ce-2a6ed977a0d1\") " pod="openshift-infra/auto-csr-approver-29564032-h5q9n" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.424378 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n42zn\" (UniqueName: \"kubernetes.io/projected/80671080-7422-4d4a-a0ce-2a6ed977a0d1-kube-api-access-n42zn\") pod \"auto-csr-approver-29564032-h5q9n\" (UID: \"80671080-7422-4d4a-a0ce-2a6ed977a0d1\") " 
pod="openshift-infra/auto-csr-approver-29564032-h5q9n" Mar 18 13:52:00 crc kubenswrapper[4912]: I0318 13:52:00.538750 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" Mar 18 13:52:01 crc kubenswrapper[4912]: I0318 13:52:01.068476 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-h5q9n"] Mar 18 13:52:01 crc kubenswrapper[4912]: I0318 13:52:01.669065 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" event={"ID":"80671080-7422-4d4a-a0ce-2a6ed977a0d1","Type":"ContainerStarted","Data":"1f6ddb3597c8b195156adc92d8428ee7cfe964001f9b0f6b8494c0b0935f9360"} Mar 18 13:52:02 crc kubenswrapper[4912]: I0318 13:52:02.686527 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" event={"ID":"80671080-7422-4d4a-a0ce-2a6ed977a0d1","Type":"ContainerStarted","Data":"3fd4236d383d388c39ae2b01f3ceb2440d208a5df9c57cecf9a26b05ffe23a14"} Mar 18 13:52:02 crc kubenswrapper[4912]: I0318 13:52:02.745000 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" podStartSLOduration=1.68027459 podStartE2EDuration="2.744969964s" podCreationTimestamp="2026-03-18 13:52:00 +0000 UTC" firstStartedPulling="2026-03-18 13:52:01.070722638 +0000 UTC m=+2969.530150063" lastFinishedPulling="2026-03-18 13:52:02.135418012 +0000 UTC m=+2970.594845437" observedRunningTime="2026-03-18 13:52:02.73364601 +0000 UTC m=+2971.193073445" watchObservedRunningTime="2026-03-18 13:52:02.744969964 +0000 UTC m=+2971.204397389" Mar 18 13:52:03 crc kubenswrapper[4912]: I0318 13:52:03.699994 4912 generic.go:334] "Generic (PLEG): container finished" podID="80671080-7422-4d4a-a0ce-2a6ed977a0d1" containerID="3fd4236d383d388c39ae2b01f3ceb2440d208a5df9c57cecf9a26b05ffe23a14" exitCode=0 Mar 18 13:52:03 crc 
kubenswrapper[4912]: I0318 13:52:03.700133 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" event={"ID":"80671080-7422-4d4a-a0ce-2a6ed977a0d1","Type":"ContainerDied","Data":"3fd4236d383d388c39ae2b01f3ceb2440d208a5df9c57cecf9a26b05ffe23a14"} Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.169745 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.345770 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-lrc4k"] Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.358624 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n42zn\" (UniqueName: \"kubernetes.io/projected/80671080-7422-4d4a-a0ce-2a6ed977a0d1-kube-api-access-n42zn\") pod \"80671080-7422-4d4a-a0ce-2a6ed977a0d1\" (UID: \"80671080-7422-4d4a-a0ce-2a6ed977a0d1\") " Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.358786 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-lrc4k"] Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.368409 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80671080-7422-4d4a-a0ce-2a6ed977a0d1-kube-api-access-n42zn" (OuterVolumeSpecName: "kube-api-access-n42zn") pod "80671080-7422-4d4a-a0ce-2a6ed977a0d1" (UID: "80671080-7422-4d4a-a0ce-2a6ed977a0d1"). InnerVolumeSpecName "kube-api-access-n42zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.462470 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n42zn\" (UniqueName: \"kubernetes.io/projected/80671080-7422-4d4a-a0ce-2a6ed977a0d1-kube-api-access-n42zn\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.749163 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" event={"ID":"80671080-7422-4d4a-a0ce-2a6ed977a0d1","Type":"ContainerDied","Data":"1f6ddb3597c8b195156adc92d8428ee7cfe964001f9b0f6b8494c0b0935f9360"} Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.749223 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6ddb3597c8b195156adc92d8428ee7cfe964001f9b0f6b8494c0b0935f9360" Mar 18 13:52:05 crc kubenswrapper[4912]: I0318 13:52:05.749655 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-h5q9n" Mar 18 13:52:06 crc kubenswrapper[4912]: I0318 13:52:06.249466 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4c29c5-28a9-4dda-ab06-f018f3edf59c" path="/var/lib/kubelet/pods/ae4c29c5-28a9-4dda-ab06-f018f3edf59c/volumes" Mar 18 13:52:06 crc kubenswrapper[4912]: I0318 13:52:06.998659 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:52:06 crc kubenswrapper[4912]: I0318 13:52:06.999541 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:52:36 crc kubenswrapper[4912]: I0318 13:52:36.998363 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:52:36 crc kubenswrapper[4912]: I0318 13:52:36.998921 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:52:37 crc kubenswrapper[4912]: I0318 13:52:36.998977 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:52:37 crc kubenswrapper[4912]: I0318 13:52:37.000346 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c10d1c28ff97f0cc0a16b7461f0b81a04d3f6128153ac9f55e2d7ef187c416f"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:52:37 crc kubenswrapper[4912]: I0318 13:52:37.000407 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://4c10d1c28ff97f0cc0a16b7461f0b81a04d3f6128153ac9f55e2d7ef187c416f" gracePeriod=600 Mar 18 13:52:37 crc kubenswrapper[4912]: I0318 13:52:37.140557 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="4c10d1c28ff97f0cc0a16b7461f0b81a04d3f6128153ac9f55e2d7ef187c416f" exitCode=0 Mar 18 13:52:37 crc kubenswrapper[4912]: I0318 13:52:37.140594 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"4c10d1c28ff97f0cc0a16b7461f0b81a04d3f6128153ac9f55e2d7ef187c416f"} Mar 18 13:52:37 crc kubenswrapper[4912]: I0318 13:52:37.140648 4912 scope.go:117] "RemoveContainer" containerID="9a072051a327862720dd4d1b4b33bcb468ba2c7093200b98f9b8a06ab85a3753" Mar 18 13:52:38 crc kubenswrapper[4912]: I0318 13:52:38.157548 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"} Mar 18 13:53:04 crc kubenswrapper[4912]: I0318 13:53:04.131906 4912 scope.go:117] "RemoveContainer" containerID="c36dd6f5e632ffa18be867f240e33100d34ebd84fc166d4160017de4af57f4e6" Mar 18 13:53:07 crc kubenswrapper[4912]: I0318 13:53:07.539553 4912 generic.go:334] "Generic (PLEG): container finished" podID="edc150d2-9448-4f82-a4a4-eeb5b0b06829" containerID="7c08c2e5ed9faedc506f8a817ab983900defef61b79ead66e06743b8d2d58fdd" exitCode=0 Mar 18 13:53:07 crc kubenswrapper[4912]: I0318 13:53:07.539637 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" event={"ID":"edc150d2-9448-4f82-a4a4-eeb5b0b06829","Type":"ContainerDied","Data":"7c08c2e5ed9faedc506f8a817ab983900defef61b79ead66e06743b8d2d58fdd"} Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.106346 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.300845 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ssh-key-openstack-edpm-ipam\") pod \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.300960 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-2\") pod \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.301090 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k56c7\" (UniqueName: \"kubernetes.io/projected/edc150d2-9448-4f82-a4a4-eeb5b0b06829-kube-api-access-k56c7\") pod \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.301273 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-inventory\") pod \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.301308 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-1\") pod \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 
13:53:09.301381 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-telemetry-combined-ca-bundle\") pod \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.301547 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-0\") pod \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\" (UID: \"edc150d2-9448-4f82-a4a4-eeb5b0b06829\") " Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.310379 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "edc150d2-9448-4f82-a4a4-eeb5b0b06829" (UID: "edc150d2-9448-4f82-a4a4-eeb5b0b06829"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.310522 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc150d2-9448-4f82-a4a4-eeb5b0b06829-kube-api-access-k56c7" (OuterVolumeSpecName: "kube-api-access-k56c7") pod "edc150d2-9448-4f82-a4a4-eeb5b0b06829" (UID: "edc150d2-9448-4f82-a4a4-eeb5b0b06829"). InnerVolumeSpecName "kube-api-access-k56c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.345965 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "edc150d2-9448-4f82-a4a4-eeb5b0b06829" (UID: "edc150d2-9448-4f82-a4a4-eeb5b0b06829"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.345997 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "edc150d2-9448-4f82-a4a4-eeb5b0b06829" (UID: "edc150d2-9448-4f82-a4a4-eeb5b0b06829"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.346623 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "edc150d2-9448-4f82-a4a4-eeb5b0b06829" (UID: "edc150d2-9448-4f82-a4a4-eeb5b0b06829"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.349939 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "edc150d2-9448-4f82-a4a4-eeb5b0b06829" (UID: "edc150d2-9448-4f82-a4a4-eeb5b0b06829"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.360922 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-inventory" (OuterVolumeSpecName: "inventory") pod "edc150d2-9448-4f82-a4a4-eeb5b0b06829" (UID: "edc150d2-9448-4f82-a4a4-eeb5b0b06829"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.405336 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.406713 4912 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.406824 4912 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.406898 4912 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.406966 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.407126 4912 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/edc150d2-9448-4f82-a4a4-eeb5b0b06829-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.407194 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k56c7\" (UniqueName: \"kubernetes.io/projected/edc150d2-9448-4f82-a4a4-eeb5b0b06829-kube-api-access-k56c7\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.569276 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" event={"ID":"edc150d2-9448-4f82-a4a4-eeb5b0b06829","Type":"ContainerDied","Data":"f36dbb62aa06e21c066310842386f6fea87503a20ed9a6b538c3c7476fd609be"} Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.569324 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f36dbb62aa06e21c066310842386f6fea87503a20ed9a6b538c3c7476fd609be" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.569411 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.699666 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm"] Mar 18 13:53:09 crc kubenswrapper[4912]: E0318 13:53:09.700432 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc150d2-9448-4f82-a4a4-eeb5b0b06829" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.700451 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc150d2-9448-4f82-a4a4-eeb5b0b06829" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 13:53:09 crc kubenswrapper[4912]: E0318 13:53:09.700493 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80671080-7422-4d4a-a0ce-2a6ed977a0d1" containerName="oc" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.700503 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="80671080-7422-4d4a-a0ce-2a6ed977a0d1" containerName="oc" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.700798 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc150d2-9448-4f82-a4a4-eeb5b0b06829" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.700817 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="80671080-7422-4d4a-a0ce-2a6ed977a0d1" containerName="oc" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.701914 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.706068 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.706706 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.706848 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.707332 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.708929 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.713510 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm"] Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.818539 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.818948 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.819213 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.819260 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.819882 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.819932 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxhf5\" 
(UniqueName: \"kubernetes.io/projected/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-kube-api-access-gxhf5\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.820074 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.925114 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.925237 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxhf5\" (UniqueName: \"kubernetes.io/projected/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-kube-api-access-gxhf5\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.925472 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.925676 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.925791 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.925896 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.925936 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-0\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.931989 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.931989 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.932076 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.935008 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: 
\"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.937500 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.949977 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:09 crc kubenswrapper[4912]: I0318 13:53:09.952500 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxhf5\" (UniqueName: \"kubernetes.io/projected/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-kube-api-access-gxhf5\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:10 crc kubenswrapper[4912]: I0318 13:53:10.032930 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:53:10 crc kubenswrapper[4912]: I0318 13:53:10.741552 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm"] Mar 18 13:53:11 crc kubenswrapper[4912]: I0318 13:53:11.618676 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" event={"ID":"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660","Type":"ContainerStarted","Data":"de1c550c02a00efa6acde2475b8da9ed55d160c0fdbb4bcf86e3017ab7826fc9"} Mar 18 13:53:11 crc kubenswrapper[4912]: I0318 13:53:11.619285 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" event={"ID":"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660","Type":"ContainerStarted","Data":"9c32d984106229862a087e44cd8d9115b4fbbdee890c1e84ffb0f757b6098ca7"} Mar 18 13:53:11 crc kubenswrapper[4912]: I0318 13:53:11.652107 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" podStartSLOduration=2.1473161259999998 podStartE2EDuration="2.652073601s" podCreationTimestamp="2026-03-18 13:53:09 +0000 UTC" firstStartedPulling="2026-03-18 13:53:10.7473716 +0000 UTC m=+3039.206799025" lastFinishedPulling="2026-03-18 13:53:11.252129075 +0000 UTC m=+3039.711556500" observedRunningTime="2026-03-18 13:53:11.645949656 +0000 UTC m=+3040.105377091" watchObservedRunningTime="2026-03-18 13:53:11.652073601 +0000 UTC m=+3040.111501036" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.153494 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564034-knq95"] Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.157378 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-knq95" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.162164 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.162806 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.163270 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.173194 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-knq95"] Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.211433 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99bxk\" (UniqueName: \"kubernetes.io/projected/c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7-kube-api-access-99bxk\") pod \"auto-csr-approver-29564034-knq95\" (UID: \"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7\") " pod="openshift-infra/auto-csr-approver-29564034-knq95" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.314399 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99bxk\" (UniqueName: \"kubernetes.io/projected/c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7-kube-api-access-99bxk\") pod \"auto-csr-approver-29564034-knq95\" (UID: \"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7\") " pod="openshift-infra/auto-csr-approver-29564034-knq95" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.347506 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99bxk\" (UniqueName: \"kubernetes.io/projected/c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7-kube-api-access-99bxk\") pod \"auto-csr-approver-29564034-knq95\" (UID: \"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7\") " 
pod="openshift-infra/auto-csr-approver-29564034-knq95" Mar 18 13:54:00 crc kubenswrapper[4912]: I0318 13:54:00.526999 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-knq95" Mar 18 13:54:01 crc kubenswrapper[4912]: I0318 13:54:01.052385 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-knq95"] Mar 18 13:54:01 crc kubenswrapper[4912]: I0318 13:54:01.231930 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564034-knq95" event={"ID":"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7","Type":"ContainerStarted","Data":"9b2ecf1902eb8e52e48b83130545d77abfa929de5fb411672b70e5aa0cb654d2"} Mar 18 13:54:03 crc kubenswrapper[4912]: I0318 13:54:03.261471 4912 generic.go:334] "Generic (PLEG): container finished" podID="c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7" containerID="7a656076ccfc09fe0621f552526bdc7d6f53cc400731e10a86c2a858fe166539" exitCode=0 Mar 18 13:54:03 crc kubenswrapper[4912]: I0318 13:54:03.261540 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564034-knq95" event={"ID":"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7","Type":"ContainerDied","Data":"7a656076ccfc09fe0621f552526bdc7d6f53cc400731e10a86c2a858fe166539"} Mar 18 13:54:04 crc kubenswrapper[4912]: I0318 13:54:04.703352 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-knq95" Mar 18 13:54:04 crc kubenswrapper[4912]: I0318 13:54:04.843437 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99bxk\" (UniqueName: \"kubernetes.io/projected/c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7-kube-api-access-99bxk\") pod \"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7\" (UID: \"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7\") " Mar 18 13:54:04 crc kubenswrapper[4912]: I0318 13:54:04.851571 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7-kube-api-access-99bxk" (OuterVolumeSpecName: "kube-api-access-99bxk") pod "c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7" (UID: "c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7"). InnerVolumeSpecName "kube-api-access-99bxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:54:04 crc kubenswrapper[4912]: I0318 13:54:04.946831 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99bxk\" (UniqueName: \"kubernetes.io/projected/c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7-kube-api-access-99bxk\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:05 crc kubenswrapper[4912]: I0318 13:54:05.294137 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-knq95" Mar 18 13:54:05 crc kubenswrapper[4912]: I0318 13:54:05.294136 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564034-knq95" event={"ID":"c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7","Type":"ContainerDied","Data":"9b2ecf1902eb8e52e48b83130545d77abfa929de5fb411672b70e5aa0cb654d2"} Mar 18 13:54:05 crc kubenswrapper[4912]: I0318 13:54:05.294297 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2ecf1902eb8e52e48b83130545d77abfa929de5fb411672b70e5aa0cb654d2" Mar 18 13:54:05 crc kubenswrapper[4912]: I0318 13:54:05.800444 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-mq8nq"] Mar 18 13:54:05 crc kubenswrapper[4912]: I0318 13:54:05.837377 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-mq8nq"] Mar 18 13:54:06 crc kubenswrapper[4912]: I0318 13:54:06.241829 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038d624a-12c8-4de7-84b3-1d84c0e7ffe3" path="/var/lib/kubelet/pods/038d624a-12c8-4de7-84b3-1d84c0e7ffe3/volumes" Mar 18 13:55:04 crc kubenswrapper[4912]: I0318 13:55:04.274949 4912 scope.go:117] "RemoveContainer" containerID="7c94629f3dbe7e99df3ee1a3e0890393b9d65c17d9608e5f598113cb13d1160e" Mar 18 13:55:06 crc kubenswrapper[4912]: I0318 13:55:06.998668 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:55:07 crc kubenswrapper[4912]: I0318 13:55:06.999610 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:55:12 crc kubenswrapper[4912]: I0318 13:55:12.579978 4912 generic.go:334] "Generic (PLEG): container finished" podID="a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" containerID="de1c550c02a00efa6acde2475b8da9ed55d160c0fdbb4bcf86e3017ab7826fc9" exitCode=0 Mar 18 13:55:12 crc kubenswrapper[4912]: I0318 13:55:12.580075 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" event={"ID":"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660","Type":"ContainerDied","Data":"de1c550c02a00efa6acde2475b8da9ed55d160c0fdbb4bcf86e3017ab7826fc9"} Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.163788 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.273336 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-telemetry-power-monitoring-combined-ca-bundle\") pod \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.273503 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxhf5\" (UniqueName: \"kubernetes.io/projected/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-kube-api-access-gxhf5\") pod \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.273550 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-0\") pod \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.273625 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-1\") pod \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.273740 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ssh-key-openstack-edpm-ipam\") pod \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.273794 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-inventory\") pod \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.274069 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-2\") pod \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\" (UID: \"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660\") " Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.281597 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-kube-api-access-gxhf5" (OuterVolumeSpecName: "kube-api-access-gxhf5") pod "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" (UID: 
"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660"). InnerVolumeSpecName "kube-api-access-gxhf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.284060 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" (UID: "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.314250 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" (UID: "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.318205 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" (UID: "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.319025 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" (UID: "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.324281 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" (UID: "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.338231 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-inventory" (OuterVolumeSpecName: "inventory") pod "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" (UID: "a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.379264 4912 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.379316 4912 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.379337 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxhf5\" (UniqueName: \"kubernetes.io/projected/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-kube-api-access-gxhf5\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.379349 4912 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.379368 4912 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.379380 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.379394 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.604685 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" event={"ID":"a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660","Type":"ContainerDied","Data":"9c32d984106229862a087e44cd8d9115b4fbbdee890c1e84ffb0f757b6098ca7"} Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.604744 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c32d984106229862a087e44cd8d9115b4fbbdee890c1e84ffb0f757b6098ca7" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.604758 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.762993 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd"] Mar 18 13:55:14 crc kubenswrapper[4912]: E0318 13:55:14.764721 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.764827 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 18 13:55:14 crc kubenswrapper[4912]: E0318 13:55:14.764945 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7" containerName="oc" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.765019 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7" containerName="oc" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 
13:55:14.765385 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.765514 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7" containerName="oc" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.766812 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.776740 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd"] Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.804137 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.804181 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.804183 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.804213 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.804592 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fwztj" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.905943 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" 
(UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.906213 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.906568 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57x9b\" (UniqueName: \"kubernetes.io/projected/555eb0bd-76ea-4584-b984-fcf3ee653fe9-kube-api-access-57x9b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.906664 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:14 crc kubenswrapper[4912]: I0318 13:55:14.907008 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.009181 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.009306 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.009384 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57x9b\" (UniqueName: \"kubernetes.io/projected/555eb0bd-76ea-4584-b984-fcf3ee653fe9-kube-api-access-57x9b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.009422 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.009505 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.016986 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.017349 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.017540 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.018063 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: 
\"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.030634 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57x9b\" (UniqueName: \"kubernetes.io/projected/555eb0bd-76ea-4584-b984-fcf3ee653fe9-kube-api-access-57x9b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvtqd\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.145094 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.936687 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd"] Mar 18 13:55:15 crc kubenswrapper[4912]: I0318 13:55:15.946215 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:55:16 crc kubenswrapper[4912]: I0318 13:55:16.637765 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" event={"ID":"555eb0bd-76ea-4584-b984-fcf3ee653fe9","Type":"ContainerStarted","Data":"b8d5dc286ba71c5242761b54d742e664ea23a657650d6af5ea84939a78c572b4"} Mar 18 13:55:17 crc kubenswrapper[4912]: I0318 13:55:17.658160 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" event={"ID":"555eb0bd-76ea-4584-b984-fcf3ee653fe9","Type":"ContainerStarted","Data":"7f5011607d9e6fd7808d1cadbeb54e8247e94912fc7c33476739484f432a4aa1"} Mar 18 13:55:17 crc kubenswrapper[4912]: I0318 13:55:17.692624 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" 
podStartSLOduration=3.075827531 podStartE2EDuration="3.692591158s" podCreationTimestamp="2026-03-18 13:55:14 +0000 UTC" firstStartedPulling="2026-03-18 13:55:15.945827871 +0000 UTC m=+3164.405255306" lastFinishedPulling="2026-03-18 13:55:16.562591478 +0000 UTC m=+3165.022018933" observedRunningTime="2026-03-18 13:55:17.682297691 +0000 UTC m=+3166.141725116" watchObservedRunningTime="2026-03-18 13:55:17.692591158 +0000 UTC m=+3166.152018583" Mar 18 13:55:31 crc kubenswrapper[4912]: I0318 13:55:31.830804 4912 generic.go:334] "Generic (PLEG): container finished" podID="555eb0bd-76ea-4584-b984-fcf3ee653fe9" containerID="7f5011607d9e6fd7808d1cadbeb54e8247e94912fc7c33476739484f432a4aa1" exitCode=0 Mar 18 13:55:31 crc kubenswrapper[4912]: I0318 13:55:31.830896 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" event={"ID":"555eb0bd-76ea-4584-b984-fcf3ee653fe9","Type":"ContainerDied","Data":"7f5011607d9e6fd7808d1cadbeb54e8247e94912fc7c33476739484f432a4aa1"} Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.338995 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.458849 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-inventory\") pod \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.461149 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-ssh-key-openstack-edpm-ipam\") pod \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.461273 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57x9b\" (UniqueName: \"kubernetes.io/projected/555eb0bd-76ea-4584-b984-fcf3ee653fe9-kube-api-access-57x9b\") pod \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.461449 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-0\") pod \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.461539 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-1\") pod \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\" (UID: \"555eb0bd-76ea-4584-b984-fcf3ee653fe9\") " Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 
13:55:33.480525 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555eb0bd-76ea-4584-b984-fcf3ee653fe9-kube-api-access-57x9b" (OuterVolumeSpecName: "kube-api-access-57x9b") pod "555eb0bd-76ea-4584-b984-fcf3ee653fe9" (UID: "555eb0bd-76ea-4584-b984-fcf3ee653fe9"). InnerVolumeSpecName "kube-api-access-57x9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.502083 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "555eb0bd-76ea-4584-b984-fcf3ee653fe9" (UID: "555eb0bd-76ea-4584-b984-fcf3ee653fe9"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.504787 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "555eb0bd-76ea-4584-b984-fcf3ee653fe9" (UID: "555eb0bd-76ea-4584-b984-fcf3ee653fe9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.505160 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "555eb0bd-76ea-4584-b984-fcf3ee653fe9" (UID: "555eb0bd-76ea-4584-b984-fcf3ee653fe9"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.508426 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-inventory" (OuterVolumeSpecName: "inventory") pod "555eb0bd-76ea-4584-b984-fcf3ee653fe9" (UID: "555eb0bd-76ea-4584-b984-fcf3ee653fe9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.564377 4912 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.564442 4912 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.564470 4912 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.564493 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/555eb0bd-76ea-4584-b984-fcf3ee653fe9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.564517 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57x9b\" (UniqueName: \"kubernetes.io/projected/555eb0bd-76ea-4584-b984-fcf3ee653fe9-kube-api-access-57x9b\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.858608 4912 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" event={"ID":"555eb0bd-76ea-4584-b984-fcf3ee653fe9","Type":"ContainerDied","Data":"b8d5dc286ba71c5242761b54d742e664ea23a657650d6af5ea84939a78c572b4"} Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.858660 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8d5dc286ba71c5242761b54d742e664ea23a657650d6af5ea84939a78c572b4" Mar 18 13:55:33 crc kubenswrapper[4912]: I0318 13:55:33.858696 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvtqd" Mar 18 13:55:36 crc kubenswrapper[4912]: I0318 13:55:36.998980 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:55:37 crc kubenswrapper[4912]: I0318 13:55:36.999897 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.171787 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564036-rdltz"] Mar 18 13:56:00 crc kubenswrapper[4912]: E0318 13:56:00.177212 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555eb0bd-76ea-4584-b984-fcf3ee653fe9" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.177283 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="555eb0bd-76ea-4584-b984-fcf3ee653fe9" 
containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.177580 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="555eb0bd-76ea-4584-b984-fcf3ee653fe9" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.178979 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-rdltz" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.182879 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.182900 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.183202 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.185806 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-rdltz"] Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.296531 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bndkd\" (UniqueName: \"kubernetes.io/projected/248a7634-1c0c-4e94-bb24-8bf23e43b2de-kube-api-access-bndkd\") pod \"auto-csr-approver-29564036-rdltz\" (UID: \"248a7634-1c0c-4e94-bb24-8bf23e43b2de\") " pod="openshift-infra/auto-csr-approver-29564036-rdltz" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.399334 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bndkd\" (UniqueName: \"kubernetes.io/projected/248a7634-1c0c-4e94-bb24-8bf23e43b2de-kube-api-access-bndkd\") pod \"auto-csr-approver-29564036-rdltz\" (UID: \"248a7634-1c0c-4e94-bb24-8bf23e43b2de\") " 
pod="openshift-infra/auto-csr-approver-29564036-rdltz" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.428865 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bndkd\" (UniqueName: \"kubernetes.io/projected/248a7634-1c0c-4e94-bb24-8bf23e43b2de-kube-api-access-bndkd\") pod \"auto-csr-approver-29564036-rdltz\" (UID: \"248a7634-1c0c-4e94-bb24-8bf23e43b2de\") " pod="openshift-infra/auto-csr-approver-29564036-rdltz" Mar 18 13:56:00 crc kubenswrapper[4912]: I0318 13:56:00.512792 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-rdltz" Mar 18 13:56:01 crc kubenswrapper[4912]: I0318 13:56:01.032554 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-rdltz"] Mar 18 13:56:01 crc kubenswrapper[4912]: I0318 13:56:01.203196 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564036-rdltz" event={"ID":"248a7634-1c0c-4e94-bb24-8bf23e43b2de","Type":"ContainerStarted","Data":"ff871e8de3ca2af2e177ff0c22f77b3bde2bfbb3e0263ae551391d7344d01845"} Mar 18 13:56:03 crc kubenswrapper[4912]: I0318 13:56:03.237058 4912 generic.go:334] "Generic (PLEG): container finished" podID="248a7634-1c0c-4e94-bb24-8bf23e43b2de" containerID="f3563743bb77423ff5879fbb37c17e2a089e0fc3a5559727d5822053cfce3dff" exitCode=0 Mar 18 13:56:03 crc kubenswrapper[4912]: I0318 13:56:03.237151 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564036-rdltz" event={"ID":"248a7634-1c0c-4e94-bb24-8bf23e43b2de","Type":"ContainerDied","Data":"f3563743bb77423ff5879fbb37c17e2a089e0fc3a5559727d5822053cfce3dff"} Mar 18 13:56:04 crc kubenswrapper[4912]: I0318 13:56:04.736540 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-rdltz" Mar 18 13:56:04 crc kubenswrapper[4912]: I0318 13:56:04.787534 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bndkd\" (UniqueName: \"kubernetes.io/projected/248a7634-1c0c-4e94-bb24-8bf23e43b2de-kube-api-access-bndkd\") pod \"248a7634-1c0c-4e94-bb24-8bf23e43b2de\" (UID: \"248a7634-1c0c-4e94-bb24-8bf23e43b2de\") " Mar 18 13:56:04 crc kubenswrapper[4912]: I0318 13:56:04.795014 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248a7634-1c0c-4e94-bb24-8bf23e43b2de-kube-api-access-bndkd" (OuterVolumeSpecName: "kube-api-access-bndkd") pod "248a7634-1c0c-4e94-bb24-8bf23e43b2de" (UID: "248a7634-1c0c-4e94-bb24-8bf23e43b2de"). InnerVolumeSpecName "kube-api-access-bndkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:04 crc kubenswrapper[4912]: I0318 13:56:04.892490 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bndkd\" (UniqueName: \"kubernetes.io/projected/248a7634-1c0c-4e94-bb24-8bf23e43b2de-kube-api-access-bndkd\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:05 crc kubenswrapper[4912]: I0318 13:56:05.266488 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564036-rdltz" event={"ID":"248a7634-1c0c-4e94-bb24-8bf23e43b2de","Type":"ContainerDied","Data":"ff871e8de3ca2af2e177ff0c22f77b3bde2bfbb3e0263ae551391d7344d01845"} Mar 18 13:56:05 crc kubenswrapper[4912]: I0318 13:56:05.266545 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff871e8de3ca2af2e177ff0c22f77b3bde2bfbb3e0263ae551391d7344d01845" Mar 18 13:56:05 crc kubenswrapper[4912]: I0318 13:56:05.266653 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-rdltz" Mar 18 13:56:05 crc kubenswrapper[4912]: I0318 13:56:05.833073 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-bxv8t"] Mar 18 13:56:05 crc kubenswrapper[4912]: I0318 13:56:05.844066 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-bxv8t"] Mar 18 13:56:06 crc kubenswrapper[4912]: I0318 13:56:06.280327 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934d7b04-df5d-4a73-a650-45b970a7a96e" path="/var/lib/kubelet/pods/934d7b04-df5d-4a73-a650-45b970a7a96e/volumes" Mar 18 13:56:06 crc kubenswrapper[4912]: I0318 13:56:06.999286 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:06.999381 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:06.999439 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:07.000657 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:07.000735 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" gracePeriod=600 Mar 18 13:56:07 crc kubenswrapper[4912]: E0318 13:56:07.143436 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:07.293340 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" exitCode=0 Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:07.293404 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"} Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:07.293464 4912 scope.go:117] "RemoveContainer" containerID="4c10d1c28ff97f0cc0a16b7461f0b81a04d3f6128153ac9f55e2d7ef187c416f" Mar 18 13:56:07 crc kubenswrapper[4912]: I0318 13:56:07.294629 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 13:56:07 crc kubenswrapper[4912]: E0318 13:56:07.295006 4912 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:56:21 crc kubenswrapper[4912]: I0318 13:56:21.242833 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 13:56:21 crc kubenswrapper[4912]: E0318 13:56:21.244898 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:56:32 crc kubenswrapper[4912]: I0318 13:56:32.472473 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:56:32 crc kubenswrapper[4912]: I0318 13:56:32.479277 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:56:32 crc kubenswrapper[4912]: I0318 13:56:32.472704 4912 
patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 13:56:32 crc kubenswrapper[4912]: I0318 13:56:32.479627 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 13:56:36 crc kubenswrapper[4912]: I0318 13:56:36.231755 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 13:56:36 crc kubenswrapper[4912]: E0318 13:56:36.232708 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:56:47 crc kubenswrapper[4912]: I0318 13:56:47.229098 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 13:56:47 crc kubenswrapper[4912]: E0318 13:56:47.230573 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:56:59 crc kubenswrapper[4912]: I0318 13:56:59.228159 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 13:56:59 crc kubenswrapper[4912]: E0318 13:56:59.229470 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:57:04 crc kubenswrapper[4912]: I0318 13:57:04.508226 4912 scope.go:117] "RemoveContainer" containerID="b030573ec8e412bc973fc21737768a288f14e4513be75738a7d3680c16fefd7a" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.706731 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zf"] Mar 18 13:57:05 crc kubenswrapper[4912]: E0318 13:57:05.709171 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="248a7634-1c0c-4e94-bb24-8bf23e43b2de" containerName="oc" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.709270 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="248a7634-1c0c-4e94-bb24-8bf23e43b2de" containerName="oc" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.709896 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="248a7634-1c0c-4e94-bb24-8bf23e43b2de" containerName="oc" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.712300 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.732758 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zf"] Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.798578 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-catalog-content\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.799102 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzfp4\" (UniqueName: \"kubernetes.io/projected/61a5522e-a146-4a12-adf7-0320bacf8f26-kube-api-access-kzfp4\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.799520 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-utilities\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.902742 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-catalog-content\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.902821 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kzfp4\" (UniqueName: \"kubernetes.io/projected/61a5522e-a146-4a12-adf7-0320bacf8f26-kube-api-access-kzfp4\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.902950 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-utilities\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.903540 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-utilities\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.903542 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-catalog-content\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:05 crc kubenswrapper[4912]: I0318 13:57:05.935298 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzfp4\" (UniqueName: \"kubernetes.io/projected/61a5522e-a146-4a12-adf7-0320bacf8f26-kube-api-access-kzfp4\") pod \"redhat-marketplace-bw8zf\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") " pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:06 crc kubenswrapper[4912]: I0318 13:57:06.045697 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:06 crc kubenswrapper[4912]: I0318 13:57:06.667018 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zf"] Mar 18 13:57:07 crc kubenswrapper[4912]: I0318 13:57:07.082600 4912 generic.go:334] "Generic (PLEG): container finished" podID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerID="ae783533f3ab69539719831d71e0f769bf878b5ae809d76da9dbadf0b068f476" exitCode=0 Mar 18 13:57:07 crc kubenswrapper[4912]: I0318 13:57:07.082700 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zf" event={"ID":"61a5522e-a146-4a12-adf7-0320bacf8f26","Type":"ContainerDied","Data":"ae783533f3ab69539719831d71e0f769bf878b5ae809d76da9dbadf0b068f476"} Mar 18 13:57:07 crc kubenswrapper[4912]: I0318 13:57:07.083011 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zf" event={"ID":"61a5522e-a146-4a12-adf7-0320bacf8f26","Type":"ContainerStarted","Data":"385094aba7b811c53494723dcff9c59f4181fb96f59a5973d2a4f9de38e33114"} Mar 18 13:57:08 crc kubenswrapper[4912]: I0318 13:57:08.099357 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zf" event={"ID":"61a5522e-a146-4a12-adf7-0320bacf8f26","Type":"ContainerStarted","Data":"90acbdbcd2f3ba6cac8687214cacee152782d79a71086d79d8872ab92b6e756b"} Mar 18 13:57:10 crc kubenswrapper[4912]: I0318 13:57:10.126568 4912 generic.go:334] "Generic (PLEG): container finished" podID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerID="90acbdbcd2f3ba6cac8687214cacee152782d79a71086d79d8872ab92b6e756b" exitCode=0 Mar 18 13:57:10 crc kubenswrapper[4912]: I0318 13:57:10.126644 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zf" 
event={"ID":"61a5522e-a146-4a12-adf7-0320bacf8f26","Type":"ContainerDied","Data":"90acbdbcd2f3ba6cac8687214cacee152782d79a71086d79d8872ab92b6e756b"} Mar 18 13:57:11 crc kubenswrapper[4912]: I0318 13:57:11.146325 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zf" event={"ID":"61a5522e-a146-4a12-adf7-0320bacf8f26","Type":"ContainerStarted","Data":"b96302fb1030eaf4f6ab5d14209fa2adb1eb747b8928489ae4d7d7615149ff8e"} Mar 18 13:57:11 crc kubenswrapper[4912]: I0318 13:57:11.171886 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bw8zf" podStartSLOduration=2.422611319 podStartE2EDuration="6.171860239s" podCreationTimestamp="2026-03-18 13:57:05 +0000 UTC" firstStartedPulling="2026-03-18 13:57:07.085735769 +0000 UTC m=+3275.545163194" lastFinishedPulling="2026-03-18 13:57:10.834984699 +0000 UTC m=+3279.294412114" observedRunningTime="2026-03-18 13:57:11.166019852 +0000 UTC m=+3279.625447287" watchObservedRunningTime="2026-03-18 13:57:11.171860239 +0000 UTC m=+3279.631287664" Mar 18 13:57:14 crc kubenswrapper[4912]: I0318 13:57:14.228292 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 13:57:14 crc kubenswrapper[4912]: E0318 13:57:14.229582 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 13:57:16 crc kubenswrapper[4912]: I0318 13:57:16.046453 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bw8zf" Mar 18 13:57:16 crc 
kubenswrapper[4912]: I0318 13:57:16.047427 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bw8zf"
Mar 18 13:57:17 crc kubenswrapper[4912]: I0318 13:57:17.146268 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bw8zf" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="registry-server" probeResult="failure" output=<
Mar 18 13:57:17 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 13:57:17 crc kubenswrapper[4912]: >
Mar 18 13:57:25 crc kubenswrapper[4912]: I0318 13:57:25.228444 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:57:25 crc kubenswrapper[4912]: E0318 13:57:25.229899 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:57:26 crc kubenswrapper[4912]: I0318 13:57:26.100777 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bw8zf"
Mar 18 13:57:26 crc kubenswrapper[4912]: I0318 13:57:26.173478 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bw8zf"
Mar 18 13:57:26 crc kubenswrapper[4912]: I0318 13:57:26.356270 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zf"]
Mar 18 13:57:27 crc kubenswrapper[4912]: I0318 13:57:27.328349 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bw8zf" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="registry-server" containerID="cri-o://b96302fb1030eaf4f6ab5d14209fa2adb1eb747b8928489ae4d7d7615149ff8e" gracePeriod=2
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.365136 4912 generic.go:334] "Generic (PLEG): container finished" podID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerID="b96302fb1030eaf4f6ab5d14209fa2adb1eb747b8928489ae4d7d7615149ff8e" exitCode=0
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.365662 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zf" event={"ID":"61a5522e-a146-4a12-adf7-0320bacf8f26","Type":"ContainerDied","Data":"b96302fb1030eaf4f6ab5d14209fa2adb1eb747b8928489ae4d7d7615149ff8e"}
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.748859 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8zf"
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.853620 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-catalog-content\") pod \"61a5522e-a146-4a12-adf7-0320bacf8f26\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") "
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.853928 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzfp4\" (UniqueName: \"kubernetes.io/projected/61a5522e-a146-4a12-adf7-0320bacf8f26-kube-api-access-kzfp4\") pod \"61a5522e-a146-4a12-adf7-0320bacf8f26\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") "
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.853977 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-utilities\") pod \"61a5522e-a146-4a12-adf7-0320bacf8f26\" (UID: \"61a5522e-a146-4a12-adf7-0320bacf8f26\") "
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.854630 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-utilities" (OuterVolumeSpecName: "utilities") pod "61a5522e-a146-4a12-adf7-0320bacf8f26" (UID: "61a5522e-a146-4a12-adf7-0320bacf8f26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.855209 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.863323 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a5522e-a146-4a12-adf7-0320bacf8f26-kube-api-access-kzfp4" (OuterVolumeSpecName: "kube-api-access-kzfp4") pod "61a5522e-a146-4a12-adf7-0320bacf8f26" (UID: "61a5522e-a146-4a12-adf7-0320bacf8f26"). InnerVolumeSpecName "kube-api-access-kzfp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.887652 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61a5522e-a146-4a12-adf7-0320bacf8f26" (UID: "61a5522e-a146-4a12-adf7-0320bacf8f26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.957202 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a5522e-a146-4a12-adf7-0320bacf8f26-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 13:57:28 crc kubenswrapper[4912]: I0318 13:57:28.957452 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzfp4\" (UniqueName: \"kubernetes.io/projected/61a5522e-a146-4a12-adf7-0320bacf8f26-kube-api-access-kzfp4\") on node \"crc\" DevicePath \"\""
Mar 18 13:57:29 crc kubenswrapper[4912]: I0318 13:57:29.379356 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zf" event={"ID":"61a5522e-a146-4a12-adf7-0320bacf8f26","Type":"ContainerDied","Data":"385094aba7b811c53494723dcff9c59f4181fb96f59a5973d2a4f9de38e33114"}
Mar 18 13:57:29 crc kubenswrapper[4912]: I0318 13:57:29.379423 4912 scope.go:117] "RemoveContainer" containerID="b96302fb1030eaf4f6ab5d14209fa2adb1eb747b8928489ae4d7d7615149ff8e"
Mar 18 13:57:29 crc kubenswrapper[4912]: I0318 13:57:29.379427 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8zf"
Mar 18 13:57:29 crc kubenswrapper[4912]: I0318 13:57:29.418618 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zf"]
Mar 18 13:57:29 crc kubenswrapper[4912]: I0318 13:57:29.418950 4912 scope.go:117] "RemoveContainer" containerID="90acbdbcd2f3ba6cac8687214cacee152782d79a71086d79d8872ab92b6e756b"
Mar 18 13:57:29 crc kubenswrapper[4912]: I0318 13:57:29.436500 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zf"]
Mar 18 13:57:29 crc kubenswrapper[4912]: I0318 13:57:29.456305 4912 scope.go:117] "RemoveContainer" containerID="ae783533f3ab69539719831d71e0f769bf878b5ae809d76da9dbadf0b068f476"
Mar 18 13:57:30 crc kubenswrapper[4912]: I0318 13:57:30.243584 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" path="/var/lib/kubelet/pods/61a5522e-a146-4a12-adf7-0320bacf8f26/volumes"
Mar 18 13:57:38 crc kubenswrapper[4912]: I0318 13:57:38.228438 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:57:38 crc kubenswrapper[4912]: E0318 13:57:38.229870 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:57:49 crc kubenswrapper[4912]: I0318 13:57:49.232580 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:57:49 crc kubenswrapper[4912]: E0318 13:57:49.233738 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.155756 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564038-vgjgt"]
Mar 18 13:58:00 crc kubenswrapper[4912]: E0318 13:58:00.157449 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="extract-utilities"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.157468 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="extract-utilities"
Mar 18 13:58:00 crc kubenswrapper[4912]: E0318 13:58:00.157492 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="extract-content"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.157503 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="extract-content"
Mar 18 13:58:00 crc kubenswrapper[4912]: E0318 13:58:00.157539 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="registry-server"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.157548 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="registry-server"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.157927 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a5522e-a146-4a12-adf7-0320bacf8f26" containerName="registry-server"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.159279 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-vgjgt"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.162892 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.163786 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.164310 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.170238 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-vgjgt"]
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.261941 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5wv\" (UniqueName: \"kubernetes.io/projected/9ba93aeb-6b96-4a8f-a5e7-8f426adb327c-kube-api-access-bx5wv\") pod \"auto-csr-approver-29564038-vgjgt\" (UID: \"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c\") " pod="openshift-infra/auto-csr-approver-29564038-vgjgt"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.365021 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5wv\" (UniqueName: \"kubernetes.io/projected/9ba93aeb-6b96-4a8f-a5e7-8f426adb327c-kube-api-access-bx5wv\") pod \"auto-csr-approver-29564038-vgjgt\" (UID: \"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c\") " pod="openshift-infra/auto-csr-approver-29564038-vgjgt"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.388112 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5wv\" (UniqueName: \"kubernetes.io/projected/9ba93aeb-6b96-4a8f-a5e7-8f426adb327c-kube-api-access-bx5wv\") pod \"auto-csr-approver-29564038-vgjgt\" (UID: \"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c\") " pod="openshift-infra/auto-csr-approver-29564038-vgjgt"
Mar 18 13:58:00 crc kubenswrapper[4912]: I0318 13:58:00.484799 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-vgjgt"
Mar 18 13:58:01 crc kubenswrapper[4912]: I0318 13:58:00.993586 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-vgjgt"]
Mar 18 13:58:01 crc kubenswrapper[4912]: I0318 13:58:01.228436 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:58:01 crc kubenswrapper[4912]: E0318 13:58:01.228931 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:58:01 crc kubenswrapper[4912]: I0318 13:58:01.807133 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564038-vgjgt" event={"ID":"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c","Type":"ContainerStarted","Data":"732ad26c1aea3139d99afc2e1008aea162933e4fca5964e0e2d2a24602b0da9a"}
Mar 18 13:58:02 crc kubenswrapper[4912]: I0318 13:58:02.824777 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564038-vgjgt" event={"ID":"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c","Type":"ContainerStarted","Data":"d5295277e69ed839713a9bef40b446aec53e72fefeea4825c893a54ea0cd42a2"}
Mar 18 13:58:02 crc kubenswrapper[4912]: I0318 13:58:02.847716 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564038-vgjgt" podStartSLOduration=1.49362968 podStartE2EDuration="2.847681536s" podCreationTimestamp="2026-03-18 13:58:00 +0000 UTC" firstStartedPulling="2026-03-18 13:58:00.996934972 +0000 UTC m=+3329.456362397" lastFinishedPulling="2026-03-18 13:58:02.350986828 +0000 UTC m=+3330.810414253" observedRunningTime="2026-03-18 13:58:02.845265451 +0000 UTC m=+3331.304692896" watchObservedRunningTime="2026-03-18 13:58:02.847681536 +0000 UTC m=+3331.307108971"
Mar 18 13:58:03 crc kubenswrapper[4912]: I0318 13:58:03.840612 4912 generic.go:334] "Generic (PLEG): container finished" podID="9ba93aeb-6b96-4a8f-a5e7-8f426adb327c" containerID="d5295277e69ed839713a9bef40b446aec53e72fefeea4825c893a54ea0cd42a2" exitCode=0
Mar 18 13:58:03 crc kubenswrapper[4912]: I0318 13:58:03.841114 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564038-vgjgt" event={"ID":"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c","Type":"ContainerDied","Data":"d5295277e69ed839713a9bef40b446aec53e72fefeea4825c893a54ea0cd42a2"}
Mar 18 13:58:05 crc kubenswrapper[4912]: I0318 13:58:05.319650 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-vgjgt"
Mar 18 13:58:05 crc kubenswrapper[4912]: I0318 13:58:05.387519 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx5wv\" (UniqueName: \"kubernetes.io/projected/9ba93aeb-6b96-4a8f-a5e7-8f426adb327c-kube-api-access-bx5wv\") pod \"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c\" (UID: \"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c\") "
Mar 18 13:58:05 crc kubenswrapper[4912]: I0318 13:58:05.421285 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba93aeb-6b96-4a8f-a5e7-8f426adb327c-kube-api-access-bx5wv" (OuterVolumeSpecName: "kube-api-access-bx5wv") pod "9ba93aeb-6b96-4a8f-a5e7-8f426adb327c" (UID: "9ba93aeb-6b96-4a8f-a5e7-8f426adb327c"). InnerVolumeSpecName "kube-api-access-bx5wv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:58:05 crc kubenswrapper[4912]: I0318 13:58:05.492809 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx5wv\" (UniqueName: \"kubernetes.io/projected/9ba93aeb-6b96-4a8f-a5e7-8f426adb327c-kube-api-access-bx5wv\") on node \"crc\" DevicePath \"\""
Mar 18 13:58:05 crc kubenswrapper[4912]: I0318 13:58:05.868262 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564038-vgjgt" event={"ID":"9ba93aeb-6b96-4a8f-a5e7-8f426adb327c","Type":"ContainerDied","Data":"732ad26c1aea3139d99afc2e1008aea162933e4fca5964e0e2d2a24602b0da9a"}
Mar 18 13:58:05 crc kubenswrapper[4912]: I0318 13:58:05.868315 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732ad26c1aea3139d99afc2e1008aea162933e4fca5964e0e2d2a24602b0da9a"
Mar 18 13:58:05 crc kubenswrapper[4912]: I0318 13:58:05.868347 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-vgjgt"
Mar 18 13:58:06 crc kubenswrapper[4912]: I0318 13:58:06.415268 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-h5q9n"]
Mar 18 13:58:06 crc kubenswrapper[4912]: I0318 13:58:06.427932 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-h5q9n"]
Mar 18 13:58:08 crc kubenswrapper[4912]: I0318 13:58:08.244571 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80671080-7422-4d4a-a0ce-2a6ed977a0d1" path="/var/lib/kubelet/pods/80671080-7422-4d4a-a0ce-2a6ed977a0d1/volumes"
Mar 18 13:58:14 crc kubenswrapper[4912]: I0318 13:58:14.229057 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:58:14 crc kubenswrapper[4912]: E0318 13:58:14.231631 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:58:25 crc kubenswrapper[4912]: I0318 13:58:25.228310 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:58:25 crc kubenswrapper[4912]: E0318 13:58:25.229541 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:58:37 crc kubenswrapper[4912]: I0318 13:58:37.229342 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:58:37 crc kubenswrapper[4912]: E0318 13:58:37.230900 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:58:51 crc kubenswrapper[4912]: I0318 13:58:51.228459 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:58:51 crc kubenswrapper[4912]: E0318 13:58:51.231082 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:59:02 crc kubenswrapper[4912]: I0318 13:59:02.242235 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:59:02 crc kubenswrapper[4912]: E0318 13:59:02.243265 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:59:04 crc kubenswrapper[4912]: I0318 13:59:04.691768 4912 scope.go:117] "RemoveContainer" containerID="3fd4236d383d388c39ae2b01f3ceb2440d208a5df9c57cecf9a26b05ffe23a14"
Mar 18 13:59:16 crc kubenswrapper[4912]: I0318 13:59:16.228368 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:59:16 crc kubenswrapper[4912]: E0318 13:59:16.229178 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:59:31 crc kubenswrapper[4912]: I0318 13:59:31.228804 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:59:31 crc kubenswrapper[4912]: E0318 13:59:31.229897 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:59:43 crc kubenswrapper[4912]: I0318 13:59:43.228165 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:59:43 crc kubenswrapper[4912]: E0318 13:59:43.229295 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 13:59:54 crc kubenswrapper[4912]: I0318 13:59:54.228109 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc"
Mar 18 13:59:54 crc kubenswrapper[4912]: E0318 13:59:54.229173 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.211383 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"]
Mar 18 14:00:00 crc kubenswrapper[4912]: E0318 14:00:00.212984 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba93aeb-6b96-4a8f-a5e7-8f426adb327c" containerName="oc"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.213008 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba93aeb-6b96-4a8f-a5e7-8f426adb327c" containerName="oc"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.213404 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba93aeb-6b96-4a8f-a5e7-8f426adb327c" containerName="oc"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.214548 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.221711 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.228253 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.310316 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564040-kllnj"]
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.313586 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b3bae79-5c08-4c38-9556-fd9bb60145fe-secret-volume\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.313662 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b3bae79-5c08-4c38-9556-fd9bb60145fe-config-volume\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.313769 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pht9l\" (UniqueName: \"kubernetes.io/projected/0b3bae79-5c08-4c38-9556-fd9bb60145fe-kube-api-access-pht9l\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.330858 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-kllnj"]
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.330909 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"]
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.331013 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-kllnj"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.350705 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.350758 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.351080 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.417237 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b3bae79-5c08-4c38-9556-fd9bb60145fe-secret-volume\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.417295 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b3bae79-5c08-4c38-9556-fd9bb60145fe-config-volume\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.417323 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfgx5\" (UniqueName: \"kubernetes.io/projected/645d1335-bcd3-4833-9d80-3d7b3403ce39-kube-api-access-zfgx5\") pod \"auto-csr-approver-29564040-kllnj\" (UID: \"645d1335-bcd3-4833-9d80-3d7b3403ce39\") " pod="openshift-infra/auto-csr-approver-29564040-kllnj"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.417401 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pht9l\" (UniqueName: \"kubernetes.io/projected/0b3bae79-5c08-4c38-9556-fd9bb60145fe-kube-api-access-pht9l\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.419521 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b3bae79-5c08-4c38-9556-fd9bb60145fe-config-volume\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.443564 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b3bae79-5c08-4c38-9556-fd9bb60145fe-secret-volume\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.444694 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pht9l\" (UniqueName: \"kubernetes.io/projected/0b3bae79-5c08-4c38-9556-fd9bb60145fe-kube-api-access-pht9l\") pod \"collect-profiles-29564040-rbwws\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.519743 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfgx5\" (UniqueName: \"kubernetes.io/projected/645d1335-bcd3-4833-9d80-3d7b3403ce39-kube-api-access-zfgx5\") pod \"auto-csr-approver-29564040-kllnj\" (UID: \"645d1335-bcd3-4833-9d80-3d7b3403ce39\") " pod="openshift-infra/auto-csr-approver-29564040-kllnj"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.541313 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfgx5\" (UniqueName: \"kubernetes.io/projected/645d1335-bcd3-4833-9d80-3d7b3403ce39-kube-api-access-zfgx5\") pod \"auto-csr-approver-29564040-kllnj\" (UID: \"645d1335-bcd3-4833-9d80-3d7b3403ce39\") " pod="openshift-infra/auto-csr-approver-29564040-kllnj"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.595909 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:00 crc kubenswrapper[4912]: I0318 14:00:00.664134 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-kllnj"
Mar 18 14:00:01 crc kubenswrapper[4912]: I0318 14:00:01.172505 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"]
Mar 18 14:00:01 crc kubenswrapper[4912]: I0318 14:00:01.345379 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-kllnj"]
Mar 18 14:00:01 crc kubenswrapper[4912]: I0318 14:00:01.855777 4912 generic.go:334] "Generic (PLEG): container finished" podID="0b3bae79-5c08-4c38-9556-fd9bb60145fe" containerID="5a4fbfb48af76d73d4161e9740ff8a25cf1e822de6d3875f411d3b71709b0446" exitCode=0
Mar 18 14:00:01 crc kubenswrapper[4912]: I0318 14:00:01.855862 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws" event={"ID":"0b3bae79-5c08-4c38-9556-fd9bb60145fe","Type":"ContainerDied","Data":"5a4fbfb48af76d73d4161e9740ff8a25cf1e822de6d3875f411d3b71709b0446"}
Mar 18 14:00:01 crc kubenswrapper[4912]: I0318 14:00:01.856271 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws" event={"ID":"0b3bae79-5c08-4c38-9556-fd9bb60145fe","Type":"ContainerStarted","Data":"46ec40b70cc3d4ebba64bafaa3581ddf72fb7a9a623d806d1d020989f2752fdc"}
Mar 18 14:00:01 crc kubenswrapper[4912]: I0318 14:00:01.858204 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-kllnj" event={"ID":"645d1335-bcd3-4833-9d80-3d7b3403ce39","Type":"ContainerStarted","Data":"7f7a0cce2f488dbe1ef63cebe0833ce63f9d93324b714b2ace7b3d8e205ab14f"}
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.362434 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws"
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.426574 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b3bae79-5c08-4c38-9556-fd9bb60145fe-secret-volume\") pod \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") "
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.426737 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pht9l\" (UniqueName: \"kubernetes.io/projected/0b3bae79-5c08-4c38-9556-fd9bb60145fe-kube-api-access-pht9l\") pod \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") "
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.426939 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b3bae79-5c08-4c38-9556-fd9bb60145fe-config-volume\") pod \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\" (UID: \"0b3bae79-5c08-4c38-9556-fd9bb60145fe\") "
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.428775 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3bae79-5c08-4c38-9556-fd9bb60145fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b3bae79-5c08-4c38-9556-fd9bb60145fe" (UID: "0b3bae79-5c08-4c38-9556-fd9bb60145fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.440253 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3bae79-5c08-4c38-9556-fd9bb60145fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b3bae79-5c08-4c38-9556-fd9bb60145fe" (UID: "0b3bae79-5c08-4c38-9556-fd9bb60145fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.442424 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3bae79-5c08-4c38-9556-fd9bb60145fe-kube-api-access-pht9l" (OuterVolumeSpecName: "kube-api-access-pht9l") pod "0b3bae79-5c08-4c38-9556-fd9bb60145fe" (UID: "0b3bae79-5c08-4c38-9556-fd9bb60145fe"). InnerVolumeSpecName "kube-api-access-pht9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.530512 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pht9l\" (UniqueName: \"kubernetes.io/projected/0b3bae79-5c08-4c38-9556-fd9bb60145fe-kube-api-access-pht9l\") on node \"crc\" DevicePath \"\""
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.530569 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b3bae79-5c08-4c38-9556-fd9bb60145fe-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.530580 4912 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b3bae79-5c08-4c38-9556-fd9bb60145fe-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.882314 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws" event={"ID":"0b3bae79-5c08-4c38-9556-fd9bb60145fe","Type":"ContainerDied","Data":"46ec40b70cc3d4ebba64bafaa3581ddf72fb7a9a623d806d1d020989f2752fdc"}
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.882664 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46ec40b70cc3d4ebba64bafaa3581ddf72fb7a9a623d806d1d020989f2752fdc"
Mar 18 14:00:03 crc kubenswrapper[4912]: I0318 14:00:03.882376 4912 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-rbwws" Mar 18 14:00:04 crc kubenswrapper[4912]: I0318 14:00:04.475479 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4"] Mar 18 14:00:04 crc kubenswrapper[4912]: I0318 14:00:04.488226 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-46cl4"] Mar 18 14:00:05 crc kubenswrapper[4912]: I0318 14:00:05.912408 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-kllnj" event={"ID":"645d1335-bcd3-4833-9d80-3d7b3403ce39","Type":"ContainerStarted","Data":"5a16dbd89408fd77e1fc00542c324a222d7e32594c482115d04bd7867895a63d"} Mar 18 14:00:05 crc kubenswrapper[4912]: I0318 14:00:05.935777 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564040-kllnj" podStartSLOduration=1.9395174929999999 podStartE2EDuration="5.935745224s" podCreationTimestamp="2026-03-18 14:00:00 +0000 UTC" firstStartedPulling="2026-03-18 14:00:01.350658353 +0000 UTC m=+3449.810085778" lastFinishedPulling="2026-03-18 14:00:05.346886074 +0000 UTC m=+3453.806313509" observedRunningTime="2026-03-18 14:00:05.929817744 +0000 UTC m=+3454.389245159" watchObservedRunningTime="2026-03-18 14:00:05.935745224 +0000 UTC m=+3454.395172650" Mar 18 14:00:06 crc kubenswrapper[4912]: I0318 14:00:06.228955 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 14:00:06 crc kubenswrapper[4912]: E0318 14:00:06.229704 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:00:06 crc kubenswrapper[4912]: I0318 14:00:06.245334 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fdc363-52c1-4525-aeaf-146ab5700fb3" path="/var/lib/kubelet/pods/23fdc363-52c1-4525-aeaf-146ab5700fb3/volumes" Mar 18 14:00:06 crc kubenswrapper[4912]: I0318 14:00:06.925479 4912 generic.go:334] "Generic (PLEG): container finished" podID="645d1335-bcd3-4833-9d80-3d7b3403ce39" containerID="5a16dbd89408fd77e1fc00542c324a222d7e32594c482115d04bd7867895a63d" exitCode=0 Mar 18 14:00:06 crc kubenswrapper[4912]: I0318 14:00:06.925546 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-kllnj" event={"ID":"645d1335-bcd3-4833-9d80-3d7b3403ce39","Type":"ContainerDied","Data":"5a16dbd89408fd77e1fc00542c324a222d7e32594c482115d04bd7867895a63d"} Mar 18 14:00:08 crc kubenswrapper[4912]: I0318 14:00:08.413580 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-kllnj" Mar 18 14:00:08 crc kubenswrapper[4912]: I0318 14:00:08.492847 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfgx5\" (UniqueName: \"kubernetes.io/projected/645d1335-bcd3-4833-9d80-3d7b3403ce39-kube-api-access-zfgx5\") pod \"645d1335-bcd3-4833-9d80-3d7b3403ce39\" (UID: \"645d1335-bcd3-4833-9d80-3d7b3403ce39\") " Mar 18 14:00:08 crc kubenswrapper[4912]: I0318 14:00:08.505380 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645d1335-bcd3-4833-9d80-3d7b3403ce39-kube-api-access-zfgx5" (OuterVolumeSpecName: "kube-api-access-zfgx5") pod "645d1335-bcd3-4833-9d80-3d7b3403ce39" (UID: "645d1335-bcd3-4833-9d80-3d7b3403ce39"). 
InnerVolumeSpecName "kube-api-access-zfgx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:00:08 crc kubenswrapper[4912]: I0318 14:00:08.597640 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfgx5\" (UniqueName: \"kubernetes.io/projected/645d1335-bcd3-4833-9d80-3d7b3403ce39-kube-api-access-zfgx5\") on node \"crc\" DevicePath \"\"" Mar 18 14:00:08 crc kubenswrapper[4912]: I0318 14:00:08.950800 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-kllnj" event={"ID":"645d1335-bcd3-4833-9d80-3d7b3403ce39","Type":"ContainerDied","Data":"7f7a0cce2f488dbe1ef63cebe0833ce63f9d93324b714b2ace7b3d8e205ab14f"} Mar 18 14:00:08 crc kubenswrapper[4912]: I0318 14:00:08.950851 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-kllnj" Mar 18 14:00:08 crc kubenswrapper[4912]: I0318 14:00:08.951669 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f7a0cce2f488dbe1ef63cebe0833ce63f9d93324b714b2ace7b3d8e205ab14f" Mar 18 14:00:09 crc kubenswrapper[4912]: I0318 14:00:09.007319 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-knq95"] Mar 18 14:00:09 crc kubenswrapper[4912]: I0318 14:00:09.021663 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-knq95"] Mar 18 14:00:10 crc kubenswrapper[4912]: I0318 14:00:10.250089 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7" path="/var/lib/kubelet/pods/c2bbfaa0-5d73-4b69-bf43-38da8dd7d3b7/volumes" Mar 18 14:00:18 crc kubenswrapper[4912]: I0318 14:00:18.228385 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 14:00:18 crc kubenswrapper[4912]: E0318 14:00:18.229659 4912 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:00:32 crc kubenswrapper[4912]: I0318 14:00:32.256809 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 14:00:32 crc kubenswrapper[4912]: E0318 14:00:32.258916 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.228607 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 14:00:47 crc kubenswrapper[4912]: E0318 14:00:47.230093 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.756132 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xzjsd"] Mar 18 14:00:47 crc kubenswrapper[4912]: E0318 14:00:47.757105 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="645d1335-bcd3-4833-9d80-3d7b3403ce39" containerName="oc" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.757131 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="645d1335-bcd3-4833-9d80-3d7b3403ce39" containerName="oc" Mar 18 14:00:47 crc kubenswrapper[4912]: E0318 14:00:47.757157 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3bae79-5c08-4c38-9556-fd9bb60145fe" containerName="collect-profiles" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.757166 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3bae79-5c08-4c38-9556-fd9bb60145fe" containerName="collect-profiles" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.757558 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3bae79-5c08-4c38-9556-fd9bb60145fe" containerName="collect-profiles" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.757608 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="645d1335-bcd3-4833-9d80-3d7b3403ce39" containerName="oc" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.760968 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.769511 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xzjsd"] Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.817882 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86wnb\" (UniqueName: \"kubernetes.io/projected/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-kube-api-access-86wnb\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.818119 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-utilities\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.818286 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-catalog-content\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.922603 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wnb\" (UniqueName: \"kubernetes.io/projected/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-kube-api-access-86wnb\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.922749 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-utilities\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.922866 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-catalog-content\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.923596 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-utilities\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.923669 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-catalog-content\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.948159 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wnb\" (UniqueName: \"kubernetes.io/projected/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-kube-api-access-86wnb\") pod \"community-operators-xzjsd\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.963687 4912 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-pjp25"] Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.969100 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:47 crc kubenswrapper[4912]: I0318 14:00:47.984768 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjp25"] Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.099779 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.131075 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsrlm\" (UniqueName: \"kubernetes.io/projected/9cc7bd6d-e6d3-483e-aa55-012be085a251-kube-api-access-lsrlm\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.131147 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-catalog-content\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.131524 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-utilities\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.235407 4912 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-utilities\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.235761 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsrlm\" (UniqueName: \"kubernetes.io/projected/9cc7bd6d-e6d3-483e-aa55-012be085a251-kube-api-access-lsrlm\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.235800 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-catalog-content\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.236755 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-catalog-content\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.237104 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-utilities\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.288300 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsrlm\" (UniqueName: 
\"kubernetes.io/projected/9cc7bd6d-e6d3-483e-aa55-012be085a251-kube-api-access-lsrlm\") pod \"certified-operators-pjp25\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.348096 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:48 crc kubenswrapper[4912]: I0318 14:00:48.916813 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xzjsd"] Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.126288 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjp25"] Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.602134 4912 generic.go:334] "Generic (PLEG): container finished" podID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerID="a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7" exitCode=0 Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.602234 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjp25" event={"ID":"9cc7bd6d-e6d3-483e-aa55-012be085a251","Type":"ContainerDied","Data":"a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7"} Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.603032 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjp25" event={"ID":"9cc7bd6d-e6d3-483e-aa55-012be085a251","Type":"ContainerStarted","Data":"0843a9b7a17b0b60eccc2701d0e2133430d4e28b925474021fba9effb4801612"} Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.605327 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.610366 4912 generic.go:334] "Generic (PLEG): container finished" podID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" 
containerID="06f435ce9d566275332ad0312e2ba5e284e87c8a0a647afc0d4144d9423068bc" exitCode=0 Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.610415 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjsd" event={"ID":"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64","Type":"ContainerDied","Data":"06f435ce9d566275332ad0312e2ba5e284e87c8a0a647afc0d4144d9423068bc"} Mar 18 14:00:49 crc kubenswrapper[4912]: I0318 14:00:49.610439 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjsd" event={"ID":"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64","Type":"ContainerStarted","Data":"fafdef5a9530fd96fc4ea702dd17c566e366fdbb2ba0499ab0a7fe6a1f83d6ff"} Mar 18 14:00:50 crc kubenswrapper[4912]: I0318 14:00:50.976371 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mphx5"] Mar 18 14:00:50 crc kubenswrapper[4912]: I0318 14:00:50.983633 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:50 crc kubenswrapper[4912]: I0318 14:00:50.993555 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mphx5"] Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.070404 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-catalog-content\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.070729 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtml\" (UniqueName: \"kubernetes.io/projected/0096eacb-f320-4527-a9d3-fff2a7e8f90d-kube-api-access-hhtml\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.070869 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-utilities\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.173524 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtml\" (UniqueName: \"kubernetes.io/projected/0096eacb-f320-4527-a9d3-fff2a7e8f90d-kube-api-access-hhtml\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.173637 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-utilities\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.173801 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-catalog-content\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.174429 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-catalog-content\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.174551 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-utilities\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.199514 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtml\" (UniqueName: \"kubernetes.io/projected/0096eacb-f320-4527-a9d3-fff2a7e8f90d-kube-api-access-hhtml\") pod \"redhat-operators-mphx5\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.380215 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.645155 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjp25" event={"ID":"9cc7bd6d-e6d3-483e-aa55-012be085a251","Type":"ContainerStarted","Data":"e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c"} Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.658889 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjsd" event={"ID":"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64","Type":"ContainerStarted","Data":"684cd3f65a0f76fb77d5bda5244fa8cfbca3fbf36616bafa8ffdc43efa41cfa6"} Mar 18 14:00:51 crc kubenswrapper[4912]: I0318 14:00:51.968683 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mphx5"] Mar 18 14:00:52 crc kubenswrapper[4912]: I0318 14:00:52.704916 4912 generic.go:334] "Generic (PLEG): container finished" podID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerID="61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07" exitCode=0 Mar 18 14:00:52 crc kubenswrapper[4912]: I0318 14:00:52.705124 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mphx5" event={"ID":"0096eacb-f320-4527-a9d3-fff2a7e8f90d","Type":"ContainerDied","Data":"61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07"} Mar 18 14:00:52 crc kubenswrapper[4912]: I0318 14:00:52.709113 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mphx5" event={"ID":"0096eacb-f320-4527-a9d3-fff2a7e8f90d","Type":"ContainerStarted","Data":"22b46f3428dd5c85e61fae9a64bcdc8a4172d738c76f8e647c59fe794d43a54a"} Mar 18 14:00:53 crc kubenswrapper[4912]: E0318 14:00:53.669922 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fd2c6b_331e_4a51_b3a6_305c0ff1ed64.slice/crio-684cd3f65a0f76fb77d5bda5244fa8cfbca3fbf36616bafa8ffdc43efa41cfa6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fd2c6b_331e_4a51_b3a6_305c0ff1ed64.slice/crio-conmon-684cd3f65a0f76fb77d5bda5244fa8cfbca3fbf36616bafa8ffdc43efa41cfa6.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:00:53 crc kubenswrapper[4912]: I0318 14:00:53.723267 4912 generic.go:334] "Generic (PLEG): container finished" podID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerID="684cd3f65a0f76fb77d5bda5244fa8cfbca3fbf36616bafa8ffdc43efa41cfa6" exitCode=0 Mar 18 14:00:53 crc kubenswrapper[4912]: I0318 14:00:53.723357 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjsd" event={"ID":"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64","Type":"ContainerDied","Data":"684cd3f65a0f76fb77d5bda5244fa8cfbca3fbf36616bafa8ffdc43efa41cfa6"} Mar 18 14:00:53 crc kubenswrapper[4912]: I0318 14:00:53.729286 4912 generic.go:334] "Generic (PLEG): container finished" podID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerID="e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c" exitCode=0 Mar 18 14:00:53 crc kubenswrapper[4912]: I0318 14:00:53.729366 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjp25" event={"ID":"9cc7bd6d-e6d3-483e-aa55-012be085a251","Type":"ContainerDied","Data":"e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c"} Mar 18 14:00:54 crc kubenswrapper[4912]: I0318 14:00:54.747638 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjsd" event={"ID":"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64","Type":"ContainerStarted","Data":"58d94074bee91bd49f6578b21dfb90145a779f864579c67a73aef26c622e70cd"} Mar 18 14:00:54 crc 
kubenswrapper[4912]: I0318 14:00:54.752441 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mphx5" event={"ID":"0096eacb-f320-4527-a9d3-fff2a7e8f90d","Type":"ContainerStarted","Data":"ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf"} Mar 18 14:00:54 crc kubenswrapper[4912]: I0318 14:00:54.763231 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjp25" event={"ID":"9cc7bd6d-e6d3-483e-aa55-012be085a251","Type":"ContainerStarted","Data":"35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150"} Mar 18 14:00:54 crc kubenswrapper[4912]: I0318 14:00:54.783229 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xzjsd" podStartSLOduration=3.224226229 podStartE2EDuration="7.783204014s" podCreationTimestamp="2026-03-18 14:00:47 +0000 UTC" firstStartedPulling="2026-03-18 14:00:49.614368888 +0000 UTC m=+3498.073796313" lastFinishedPulling="2026-03-18 14:00:54.173346673 +0000 UTC m=+3502.632774098" observedRunningTime="2026-03-18 14:00:54.782611688 +0000 UTC m=+3503.242039113" watchObservedRunningTime="2026-03-18 14:00:54.783204014 +0000 UTC m=+3503.242631439" Mar 18 14:00:54 crc kubenswrapper[4912]: I0318 14:00:54.834264 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pjp25" podStartSLOduration=3.259788543 podStartE2EDuration="7.834242219s" podCreationTimestamp="2026-03-18 14:00:47 +0000 UTC" firstStartedPulling="2026-03-18 14:00:49.605107326 +0000 UTC m=+3498.064534741" lastFinishedPulling="2026-03-18 14:00:54.179560992 +0000 UTC m=+3502.638988417" observedRunningTime="2026-03-18 14:00:54.832139082 +0000 UTC m=+3503.291566637" watchObservedRunningTime="2026-03-18 14:00:54.834242219 +0000 UTC m=+3503.293669644" Mar 18 14:00:58 crc kubenswrapper[4912]: I0318 14:00:58.100682 4912 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:58 crc kubenswrapper[4912]: I0318 14:00:58.103621 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:00:58 crc kubenswrapper[4912]: I0318 14:00:58.348956 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:58 crc kubenswrapper[4912]: I0318 14:00:58.349058 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:00:59 crc kubenswrapper[4912]: I0318 14:00:59.158514 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xzjsd" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="registry-server" probeResult="failure" output=< Mar 18 14:00:59 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:00:59 crc kubenswrapper[4912]: > Mar 18 14:00:59 crc kubenswrapper[4912]: I0318 14:00:59.407898 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pjp25" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="registry-server" probeResult="failure" output=< Mar 18 14:00:59 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:00:59 crc kubenswrapper[4912]: > Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.203358 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564041-fj5hw"] Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.210511 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.228143 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564041-fj5hw"] Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.287616 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-fernet-keys\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.287759 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9988\" (UniqueName: \"kubernetes.io/projected/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-kube-api-access-j9988\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.289228 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-combined-ca-bundle\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.289289 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-config-data\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.392033 4912 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j9988\" (UniqueName: \"kubernetes.io/projected/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-kube-api-access-j9988\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.392608 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-combined-ca-bundle\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.392634 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-config-data\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.392876 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-fernet-keys\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.401278 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-combined-ca-bundle\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.403081 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-fernet-keys\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.404483 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-config-data\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.412882 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9988\" (UniqueName: \"kubernetes.io/projected/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-kube-api-access-j9988\") pod \"keystone-cron-29564041-fj5hw\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.564735 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.845842 4912 generic.go:334] "Generic (PLEG): container finished" podID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerID="ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf" exitCode=0 Mar 18 14:01:00 crc kubenswrapper[4912]: I0318 14:01:00.845919 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mphx5" event={"ID":"0096eacb-f320-4527-a9d3-fff2a7e8f90d","Type":"ContainerDied","Data":"ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf"} Mar 18 14:01:01 crc kubenswrapper[4912]: I0318 14:01:01.228431 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 14:01:01 crc kubenswrapper[4912]: E0318 14:01:01.228861 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:01:01 crc kubenswrapper[4912]: I0318 14:01:01.865831 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mphx5" event={"ID":"0096eacb-f320-4527-a9d3-fff2a7e8f90d","Type":"ContainerStarted","Data":"3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251"} Mar 18 14:01:01 crc kubenswrapper[4912]: I0318 14:01:01.901585 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mphx5" podStartSLOduration=3.23180662 podStartE2EDuration="11.901552757s" podCreationTimestamp="2026-03-18 14:00:50 +0000 UTC" firstStartedPulling="2026-03-18 14:00:52.719094776 
+0000 UTC m=+3501.178522201" lastFinishedPulling="2026-03-18 14:01:01.388840913 +0000 UTC m=+3509.848268338" observedRunningTime="2026-03-18 14:01:01.888685588 +0000 UTC m=+3510.348113013" watchObservedRunningTime="2026-03-18 14:01:01.901552757 +0000 UTC m=+3510.360980182" Mar 18 14:01:02 crc kubenswrapper[4912]: I0318 14:01:02.246007 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564041-fj5hw"] Mar 18 14:01:02 crc kubenswrapper[4912]: I0318 14:01:02.885539 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-fj5hw" event={"ID":"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8","Type":"ContainerStarted","Data":"b2222667cd0ca684406c8050fb32f1cae297764163c548932f174bd8319ec009"} Mar 18 14:01:02 crc kubenswrapper[4912]: I0318 14:01:02.885914 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-fj5hw" event={"ID":"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8","Type":"ContainerStarted","Data":"7b501a90890a34524cc712c7ef050b845a6eb44792dc932cc8f501be61779b06"} Mar 18 14:01:02 crc kubenswrapper[4912]: I0318 14:01:02.923737 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564041-fj5hw" podStartSLOduration=2.923710517 podStartE2EDuration="2.923710517s" podCreationTimestamp="2026-03-18 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:01:02.911786223 +0000 UTC m=+3511.371213658" watchObservedRunningTime="2026-03-18 14:01:02.923710517 +0000 UTC m=+3511.383137942" Mar 18 14:01:04 crc kubenswrapper[4912]: I0318 14:01:04.853229 4912 scope.go:117] "RemoveContainer" containerID="2b8391e8a5000fd9ad79501cf910594a527cb4f931dfaa1905e8cdf38240ac23" Mar 18 14:01:04 crc kubenswrapper[4912]: I0318 14:01:04.933065 4912 scope.go:117] "RemoveContainer" containerID="7a656076ccfc09fe0621f552526bdc7d6f53cc400731e10a86c2a858fe166539" Mar 
18 14:01:06 crc kubenswrapper[4912]: I0318 14:01:06.950509 4912 generic.go:334] "Generic (PLEG): container finished" podID="05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" containerID="b2222667cd0ca684406c8050fb32f1cae297764163c548932f174bd8319ec009" exitCode=0 Mar 18 14:01:06 crc kubenswrapper[4912]: I0318 14:01:06.950595 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-fj5hw" event={"ID":"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8","Type":"ContainerDied","Data":"b2222667cd0ca684406c8050fb32f1cae297764163c548932f174bd8319ec009"} Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.179723 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.259972 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.434743 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.461605 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-combined-ca-bundle\") pod \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.530657 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" (UID: "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.564980 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-config-data\") pod \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.565149 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9988\" (UniqueName: \"kubernetes.io/projected/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-kube-api-access-j9988\") pod \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.565332 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-fernet-keys\") pod \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\" (UID: \"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8\") " Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.568740 4912 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.571960 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-kube-api-access-j9988" (OuterVolumeSpecName: "kube-api-access-j9988") pod "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" (UID: "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8"). InnerVolumeSpecName "kube-api-access-j9988". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.573184 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" (UID: "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.652550 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-config-data" (OuterVolumeSpecName: "config-data") pod "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" (UID: "05fe35d1-77ac-4349-b1a0-25cfd25bc5a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.671716 4912 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.671771 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.671784 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9988\" (UniqueName: \"kubernetes.io/projected/05fe35d1-77ac-4349-b1a0-25cfd25bc5a8-kube-api-access-j9988\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.978085 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-fj5hw" 
event={"ID":"05fe35d1-77ac-4349-b1a0-25cfd25bc5a8","Type":"ContainerDied","Data":"7b501a90890a34524cc712c7ef050b845a6eb44792dc932cc8f501be61779b06"} Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.978159 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b501a90890a34524cc712c7ef050b845a6eb44792dc932cc8f501be61779b06" Mar 18 14:01:08 crc kubenswrapper[4912]: I0318 14:01:08.978174 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564041-fj5hw" Mar 18 14:01:09 crc kubenswrapper[4912]: I0318 14:01:09.405144 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pjp25" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="registry-server" probeResult="failure" output=< Mar 18 14:01:09 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:01:09 crc kubenswrapper[4912]: > Mar 18 14:01:11 crc kubenswrapper[4912]: I0318 14:01:11.381062 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:01:11 crc kubenswrapper[4912]: I0318 14:01:11.381635 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:01:11 crc kubenswrapper[4912]: I0318 14:01:11.746820 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xzjsd"] Mar 18 14:01:11 crc kubenswrapper[4912]: I0318 14:01:11.747200 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xzjsd" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="registry-server" containerID="cri-o://58d94074bee91bd49f6578b21dfb90145a779f864579c67a73aef26c622e70cd" gracePeriod=2 Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.020827 4912 generic.go:334] "Generic (PLEG): 
container finished" podID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerID="58d94074bee91bd49f6578b21dfb90145a779f864579c67a73aef26c622e70cd" exitCode=0 Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.020917 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjsd" event={"ID":"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64","Type":"ContainerDied","Data":"58d94074bee91bd49f6578b21dfb90145a779f864579c67a73aef26c622e70cd"} Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.398403 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.436912 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mphx5" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="registry-server" probeResult="failure" output=< Mar 18 14:01:12 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:01:12 crc kubenswrapper[4912]: > Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.486543 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-utilities\") pod \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.487060 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-catalog-content\") pod \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.487296 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-utilities" (OuterVolumeSpecName: "utilities") pod "38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" (UID: "38fd2c6b-331e-4a51-b3a6-305c0ff1ed64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.487366 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86wnb\" (UniqueName: \"kubernetes.io/projected/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-kube-api-access-86wnb\") pod \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\" (UID: \"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64\") " Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.488462 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.494699 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-kube-api-access-86wnb" (OuterVolumeSpecName: "kube-api-access-86wnb") pod "38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" (UID: "38fd2c6b-331e-4a51-b3a6-305c0ff1ed64"). InnerVolumeSpecName "kube-api-access-86wnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.555757 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" (UID: "38fd2c6b-331e-4a51-b3a6-305c0ff1ed64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.591092 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86wnb\" (UniqueName: \"kubernetes.io/projected/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-kube-api-access-86wnb\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:12 crc kubenswrapper[4912]: I0318 14:01:12.591148 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:13 crc kubenswrapper[4912]: I0318 14:01:13.039270 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xzjsd" Mar 18 14:01:13 crc kubenswrapper[4912]: I0318 14:01:13.039310 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjsd" event={"ID":"38fd2c6b-331e-4a51-b3a6-305c0ff1ed64","Type":"ContainerDied","Data":"fafdef5a9530fd96fc4ea702dd17c566e366fdbb2ba0499ab0a7fe6a1f83d6ff"} Mar 18 14:01:13 crc kubenswrapper[4912]: I0318 14:01:13.039429 4912 scope.go:117] "RemoveContainer" containerID="58d94074bee91bd49f6578b21dfb90145a779f864579c67a73aef26c622e70cd" Mar 18 14:01:13 crc kubenswrapper[4912]: I0318 14:01:13.090503 4912 scope.go:117] "RemoveContainer" containerID="684cd3f65a0f76fb77d5bda5244fa8cfbca3fbf36616bafa8ffdc43efa41cfa6" Mar 18 14:01:13 crc kubenswrapper[4912]: I0318 14:01:13.098480 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xzjsd"] Mar 18 14:01:13 crc kubenswrapper[4912]: I0318 14:01:13.117578 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xzjsd"] Mar 18 14:01:13 crc kubenswrapper[4912]: I0318 14:01:13.118513 4912 scope.go:117] "RemoveContainer" 
containerID="06f435ce9d566275332ad0312e2ba5e284e87c8a0a647afc0d4144d9423068bc" Mar 18 14:01:14 crc kubenswrapper[4912]: I0318 14:01:14.228671 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 14:01:14 crc kubenswrapper[4912]: I0318 14:01:14.272653 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" path="/var/lib/kubelet/pods/38fd2c6b-331e-4a51-b3a6-305c0ff1ed64/volumes" Mar 18 14:01:15 crc kubenswrapper[4912]: I0318 14:01:15.076230 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"08e9b8ae719ab48acec11413976ee77a91412253b2f23af52395c546c6c4da43"} Mar 18 14:01:18 crc kubenswrapper[4912]: I0318 14:01:18.404747 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:01:18 crc kubenswrapper[4912]: I0318 14:01:18.466266 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:01:19 crc kubenswrapper[4912]: I0318 14:01:19.942912 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pjp25"] Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.146079 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pjp25" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="registry-server" containerID="cri-o://35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150" gracePeriod=2 Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.813453 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.887114 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-utilities\") pod \"9cc7bd6d-e6d3-483e-aa55-012be085a251\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.888133 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-utilities" (OuterVolumeSpecName: "utilities") pod "9cc7bd6d-e6d3-483e-aa55-012be085a251" (UID: "9cc7bd6d-e6d3-483e-aa55-012be085a251"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.888618 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsrlm\" (UniqueName: \"kubernetes.io/projected/9cc7bd6d-e6d3-483e-aa55-012be085a251-kube-api-access-lsrlm\") pod \"9cc7bd6d-e6d3-483e-aa55-012be085a251\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.888721 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-catalog-content\") pod \"9cc7bd6d-e6d3-483e-aa55-012be085a251\" (UID: \"9cc7bd6d-e6d3-483e-aa55-012be085a251\") " Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.911202 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc7bd6d-e6d3-483e-aa55-012be085a251-kube-api-access-lsrlm" (OuterVolumeSpecName: "kube-api-access-lsrlm") pod "9cc7bd6d-e6d3-483e-aa55-012be085a251" (UID: "9cc7bd6d-e6d3-483e-aa55-012be085a251"). InnerVolumeSpecName "kube-api-access-lsrlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.929772 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.929819 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsrlm\" (UniqueName: \"kubernetes.io/projected/9cc7bd6d-e6d3-483e-aa55-012be085a251-kube-api-access-lsrlm\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:20 crc kubenswrapper[4912]: I0318 14:01:20.978114 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cc7bd6d-e6d3-483e-aa55-012be085a251" (UID: "9cc7bd6d-e6d3-483e-aa55-012be085a251"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.032591 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc7bd6d-e6d3-483e-aa55-012be085a251-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.164288 4912 generic.go:334] "Generic (PLEG): container finished" podID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerID="35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150" exitCode=0 Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.164371 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjp25" event={"ID":"9cc7bd6d-e6d3-483e-aa55-012be085a251","Type":"ContainerDied","Data":"35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150"} Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.164413 4912 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pjp25" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.164436 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjp25" event={"ID":"9cc7bd6d-e6d3-483e-aa55-012be085a251","Type":"ContainerDied","Data":"0843a9b7a17b0b60eccc2701d0e2133430d4e28b925474021fba9effb4801612"} Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.164463 4912 scope.go:117] "RemoveContainer" containerID="35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.197689 4912 scope.go:117] "RemoveContainer" containerID="e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.300296 4912 scope.go:117] "RemoveContainer" containerID="a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.326521 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pjp25"] Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.337575 4912 scope.go:117] "RemoveContainer" containerID="35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150" Mar 18 14:01:21 crc kubenswrapper[4912]: E0318 14:01:21.338580 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150\": container with ID starting with 35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150 not found: ID does not exist" containerID="35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.338664 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150"} 
err="failed to get container status \"35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150\": rpc error: code = NotFound desc = could not find container \"35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150\": container with ID starting with 35e6a2ed2214690fe0f9566b1682d02337d5593fb201dc863cf26b28ba02a150 not found: ID does not exist" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.338700 4912 scope.go:117] "RemoveContainer" containerID="e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c" Mar 18 14:01:21 crc kubenswrapper[4912]: E0318 14:01:21.339355 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c\": container with ID starting with e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c not found: ID does not exist" containerID="e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.339424 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c"} err="failed to get container status \"e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c\": rpc error: code = NotFound desc = could not find container \"e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c\": container with ID starting with e3b89fa0dafa717920a4d57081013cdc45ee3707831a5d081ed4131262f06d1c not found: ID does not exist" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.339447 4912 scope.go:117] "RemoveContainer" containerID="a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7" Mar 18 14:01:21 crc kubenswrapper[4912]: E0318 14:01:21.339696 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7\": container with ID starting with a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7 not found: ID does not exist" containerID="a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.339721 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7"} err="failed to get container status \"a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7\": rpc error: code = NotFound desc = could not find container \"a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7\": container with ID starting with a57768a72d656f1b39c2914fdb8f387ed6ac580c57ff2600057b8dcb104864b7 not found: ID does not exist" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.367503 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pjp25"] Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.444900 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:01:21 crc kubenswrapper[4912]: I0318 14:01:21.504617 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:01:22 crc kubenswrapper[4912]: I0318 14:01:22.242161 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" path="/var/lib/kubelet/pods/9cc7bd6d-e6d3-483e-aa55-012be085a251/volumes" Mar 18 14:01:23 crc kubenswrapper[4912]: I0318 14:01:23.339724 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mphx5"] Mar 18 14:01:23 crc kubenswrapper[4912]: I0318 14:01:23.340613 4912 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-mphx5" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="registry-server" containerID="cri-o://3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251" gracePeriod=2 Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.042323 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.147432 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-utilities\") pod \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.147796 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-catalog-content\") pod \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.147933 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhtml\" (UniqueName: \"kubernetes.io/projected/0096eacb-f320-4527-a9d3-fff2a7e8f90d-kube-api-access-hhtml\") pod \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\" (UID: \"0096eacb-f320-4527-a9d3-fff2a7e8f90d\") " Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.149073 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-utilities" (OuterVolumeSpecName: "utilities") pod "0096eacb-f320-4527-a9d3-fff2a7e8f90d" (UID: "0096eacb-f320-4527-a9d3-fff2a7e8f90d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.216368 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0096eacb-f320-4527-a9d3-fff2a7e8f90d-kube-api-access-hhtml" (OuterVolumeSpecName: "kube-api-access-hhtml") pod "0096eacb-f320-4527-a9d3-fff2a7e8f90d" (UID: "0096eacb-f320-4527-a9d3-fff2a7e8f90d"). InnerVolumeSpecName "kube-api-access-hhtml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.257506 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.270368 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhtml\" (UniqueName: \"kubernetes.io/projected/0096eacb-f320-4527-a9d3-fff2a7e8f90d-kube-api-access-hhtml\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.292067 4912 generic.go:334] "Generic (PLEG): container finished" podID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerID="3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251" exitCode=0 Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.292186 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mphx5" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.293967 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mphx5" event={"ID":"0096eacb-f320-4527-a9d3-fff2a7e8f90d","Type":"ContainerDied","Data":"3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251"} Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.294003 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mphx5" event={"ID":"0096eacb-f320-4527-a9d3-fff2a7e8f90d","Type":"ContainerDied","Data":"22b46f3428dd5c85e61fae9a64bcdc8a4172d738c76f8e647c59fe794d43a54a"} Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.294027 4912 scope.go:117] "RemoveContainer" containerID="3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.350627 4912 scope.go:117] "RemoveContainer" containerID="ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.388501 4912 scope.go:117] "RemoveContainer" containerID="61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.445687 4912 scope.go:117] "RemoveContainer" containerID="3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251" Mar 18 14:01:24 crc kubenswrapper[4912]: E0318 14:01:24.451241 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251\": container with ID starting with 3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251 not found: ID does not exist" containerID="3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.451306 4912 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251"} err="failed to get container status \"3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251\": rpc error: code = NotFound desc = could not find container \"3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251\": container with ID starting with 3a546db5758567a1083d5c43df51fa66143ffef8f4207beaee172399cb982251 not found: ID does not exist" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.451347 4912 scope.go:117] "RemoveContainer" containerID="ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf" Mar 18 14:01:24 crc kubenswrapper[4912]: E0318 14:01:24.451708 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf\": container with ID starting with ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf not found: ID does not exist" containerID="ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.451736 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf"} err="failed to get container status \"ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf\": rpc error: code = NotFound desc = could not find container \"ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf\": container with ID starting with ec672f67c9db07c1cbec6e4897796ddc761314d5bad2cbb4e47506ef3e6fbccf not found: ID does not exist" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.451754 4912 scope.go:117] "RemoveContainer" containerID="61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07" Mar 18 14:01:24 crc kubenswrapper[4912]: E0318 14:01:24.452285 4912 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07\": container with ID starting with 61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07 not found: ID does not exist" containerID="61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.452314 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07"} err="failed to get container status \"61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07\": rpc error: code = NotFound desc = could not find container \"61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07\": container with ID starting with 61711a42f2b98a6a3edf93ca453a964a247f11fde93913c61db0fdccd4510a07 not found: ID does not exist" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.472564 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0096eacb-f320-4527-a9d3-fff2a7e8f90d" (UID: "0096eacb-f320-4527-a9d3-fff2a7e8f90d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.477328 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0096eacb-f320-4527-a9d3-fff2a7e8f90d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.637515 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mphx5"] Mar 18 14:01:24 crc kubenswrapper[4912]: I0318 14:01:24.654973 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mphx5"] Mar 18 14:01:26 crc kubenswrapper[4912]: I0318 14:01:26.243873 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" path="/var/lib/kubelet/pods/0096eacb-f320-4527-a9d3-fff2a7e8f90d/volumes" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.155212 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564042-9c8xl"] Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157075 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157104 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157141 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="extract-content" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157154 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="extract-content" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157183 4912 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="extract-utilities" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157199 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="extract-utilities" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157223 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="extract-utilities" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157234 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="extract-utilities" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157265 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="extract-utilities" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157277 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="extract-utilities" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157304 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157316 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157377 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="extract-content" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157389 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="extract-content" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157410 4912 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" containerName="keystone-cron" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157421 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" containerName="keystone-cron" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157448 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="extract-content" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157461 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="extract-content" Mar 18 14:02:00 crc kubenswrapper[4912]: E0318 14:02:00.157508 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157521 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157897 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc7bd6d-e6d3-483e-aa55-012be085a251" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157939 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fe35d1-77ac-4349-b1a0-25cfd25bc5a8" containerName="keystone-cron" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157959 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="0096eacb-f320-4527-a9d3-fff2a7e8f90d" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.157991 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="38fd2c6b-331e-4a51-b3a6-305c0ff1ed64" containerName="registry-server" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.159424 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.162199 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.162351 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.162376 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.175656 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-9c8xl"] Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.233017 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wlp8\" (UniqueName: \"kubernetes.io/projected/36bb26ac-5ec1-404c-8faf-3c5d40bb699d-kube-api-access-8wlp8\") pod \"auto-csr-approver-29564042-9c8xl\" (UID: \"36bb26ac-5ec1-404c-8faf-3c5d40bb699d\") " pod="openshift-infra/auto-csr-approver-29564042-9c8xl" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.336164 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wlp8\" (UniqueName: \"kubernetes.io/projected/36bb26ac-5ec1-404c-8faf-3c5d40bb699d-kube-api-access-8wlp8\") pod \"auto-csr-approver-29564042-9c8xl\" (UID: \"36bb26ac-5ec1-404c-8faf-3c5d40bb699d\") " pod="openshift-infra/auto-csr-approver-29564042-9c8xl" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.359941 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wlp8\" (UniqueName: \"kubernetes.io/projected/36bb26ac-5ec1-404c-8faf-3c5d40bb699d-kube-api-access-8wlp8\") pod \"auto-csr-approver-29564042-9c8xl\" (UID: \"36bb26ac-5ec1-404c-8faf-3c5d40bb699d\") " 
pod="openshift-infra/auto-csr-approver-29564042-9c8xl" Mar 18 14:02:00 crc kubenswrapper[4912]: I0318 14:02:00.487005 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" Mar 18 14:02:01 crc kubenswrapper[4912]: I0318 14:02:01.038192 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-9c8xl"] Mar 18 14:02:01 crc kubenswrapper[4912]: I0318 14:02:01.901066 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" event={"ID":"36bb26ac-5ec1-404c-8faf-3c5d40bb699d","Type":"ContainerStarted","Data":"237bd57436e7f9105a4c75ad59c109df8ca9927cadf5ea8b5f404f7c3ea887b2"} Mar 18 14:02:02 crc kubenswrapper[4912]: I0318 14:02:02.917912 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" event={"ID":"36bb26ac-5ec1-404c-8faf-3c5d40bb699d","Type":"ContainerStarted","Data":"e261d92a7ff0e096355713ef04c6f12ec770dda9a5317cd2eb9bd3fa98915fc6"} Mar 18 14:02:03 crc kubenswrapper[4912]: I0318 14:02:03.936646 4912 generic.go:334] "Generic (PLEG): container finished" podID="36bb26ac-5ec1-404c-8faf-3c5d40bb699d" containerID="e261d92a7ff0e096355713ef04c6f12ec770dda9a5317cd2eb9bd3fa98915fc6" exitCode=0 Mar 18 14:02:03 crc kubenswrapper[4912]: I0318 14:02:03.936807 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" event={"ID":"36bb26ac-5ec1-404c-8faf-3c5d40bb699d","Type":"ContainerDied","Data":"e261d92a7ff0e096355713ef04c6f12ec770dda9a5317cd2eb9bd3fa98915fc6"} Mar 18 14:02:05 crc kubenswrapper[4912]: I0318 14:02:05.536829 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" Mar 18 14:02:05 crc kubenswrapper[4912]: I0318 14:02:05.623462 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wlp8\" (UniqueName: \"kubernetes.io/projected/36bb26ac-5ec1-404c-8faf-3c5d40bb699d-kube-api-access-8wlp8\") pod \"36bb26ac-5ec1-404c-8faf-3c5d40bb699d\" (UID: \"36bb26ac-5ec1-404c-8faf-3c5d40bb699d\") " Mar 18 14:02:05 crc kubenswrapper[4912]: I0318 14:02:05.631660 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bb26ac-5ec1-404c-8faf-3c5d40bb699d-kube-api-access-8wlp8" (OuterVolumeSpecName: "kube-api-access-8wlp8") pod "36bb26ac-5ec1-404c-8faf-3c5d40bb699d" (UID: "36bb26ac-5ec1-404c-8faf-3c5d40bb699d"). InnerVolumeSpecName "kube-api-access-8wlp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:02:05 crc kubenswrapper[4912]: I0318 14:02:05.727940 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wlp8\" (UniqueName: \"kubernetes.io/projected/36bb26ac-5ec1-404c-8faf-3c5d40bb699d-kube-api-access-8wlp8\") on node \"crc\" DevicePath \"\"" Mar 18 14:02:05 crc kubenswrapper[4912]: I0318 14:02:05.971459 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" event={"ID":"36bb26ac-5ec1-404c-8faf-3c5d40bb699d","Type":"ContainerDied","Data":"237bd57436e7f9105a4c75ad59c109df8ca9927cadf5ea8b5f404f7c3ea887b2"} Mar 18 14:02:05 crc kubenswrapper[4912]: I0318 14:02:05.971511 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="237bd57436e7f9105a4c75ad59c109df8ca9927cadf5ea8b5f404f7c3ea887b2" Mar 18 14:02:05 crc kubenswrapper[4912]: I0318 14:02:05.971519 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-9c8xl" Mar 18 14:02:06 crc kubenswrapper[4912]: I0318 14:02:06.639658 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-rdltz"] Mar 18 14:02:06 crc kubenswrapper[4912]: I0318 14:02:06.656498 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-rdltz"] Mar 18 14:02:08 crc kubenswrapper[4912]: I0318 14:02:08.242807 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248a7634-1c0c-4e94-bb24-8bf23e43b2de" path="/var/lib/kubelet/pods/248a7634-1c0c-4e94-bb24-8bf23e43b2de/volumes" Mar 18 14:03:05 crc kubenswrapper[4912]: I0318 14:03:05.257447 4912 scope.go:117] "RemoveContainer" containerID="f3563743bb77423ff5879fbb37c17e2a089e0fc3a5559727d5822053cfce3dff" Mar 18 14:03:36 crc kubenswrapper[4912]: I0318 14:03:36.998603 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:03:37 crc kubenswrapper[4912]: I0318 14:03:36.999578 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.162702 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564044-2lg7g"] Mar 18 14:04:00 crc kubenswrapper[4912]: E0318 14:04:00.164438 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bb26ac-5ec1-404c-8faf-3c5d40bb699d" containerName="oc" Mar 18 14:04:00 crc 
kubenswrapper[4912]: I0318 14:04:00.164461 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bb26ac-5ec1-404c-8faf-3c5d40bb699d" containerName="oc" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.164849 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="36bb26ac-5ec1-404c-8faf-3c5d40bb699d" containerName="oc" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.166090 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.170773 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.170838 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.171129 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.178088 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-2lg7g"] Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.351303 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbz9\" (UniqueName: \"kubernetes.io/projected/99debac2-992e-4f94-8f0f-d1c206348a7a-kube-api-access-fwbz9\") pod \"auto-csr-approver-29564044-2lg7g\" (UID: \"99debac2-992e-4f94-8f0f-d1c206348a7a\") " pod="openshift-infra/auto-csr-approver-29564044-2lg7g" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.455229 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbz9\" (UniqueName: \"kubernetes.io/projected/99debac2-992e-4f94-8f0f-d1c206348a7a-kube-api-access-fwbz9\") pod \"auto-csr-approver-29564044-2lg7g\" 
(UID: \"99debac2-992e-4f94-8f0f-d1c206348a7a\") " pod="openshift-infra/auto-csr-approver-29564044-2lg7g" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.477771 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbz9\" (UniqueName: \"kubernetes.io/projected/99debac2-992e-4f94-8f0f-d1c206348a7a-kube-api-access-fwbz9\") pod \"auto-csr-approver-29564044-2lg7g\" (UID: \"99debac2-992e-4f94-8f0f-d1c206348a7a\") " pod="openshift-infra/auto-csr-approver-29564044-2lg7g" Mar 18 14:04:00 crc kubenswrapper[4912]: I0318 14:04:00.493154 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" Mar 18 14:04:01 crc kubenswrapper[4912]: I0318 14:04:01.027279 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-2lg7g"] Mar 18 14:04:01 crc kubenswrapper[4912]: I0318 14:04:01.568493 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" event={"ID":"99debac2-992e-4f94-8f0f-d1c206348a7a","Type":"ContainerStarted","Data":"bd1ab6f9a05de4d44709678325a871dd2b8aac64a9dab9b4441319483e9eece6"} Mar 18 14:04:02 crc kubenswrapper[4912]: I0318 14:04:02.593763 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" event={"ID":"99debac2-992e-4f94-8f0f-d1c206348a7a","Type":"ContainerStarted","Data":"29c588cc977d164cd989e7cdd5b9f2693c4aa5347ffc6851045165e27c699583"} Mar 18 14:04:02 crc kubenswrapper[4912]: I0318 14:04:02.627773 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" podStartSLOduration=1.509697279 podStartE2EDuration="2.62773738s" podCreationTimestamp="2026-03-18 14:04:00 +0000 UTC" firstStartedPulling="2026-03-18 14:04:01.027708388 +0000 UTC m=+3689.487135813" lastFinishedPulling="2026-03-18 14:04:02.145748469 +0000 UTC 
m=+3690.605175914" observedRunningTime="2026-03-18 14:04:02.614190643 +0000 UTC m=+3691.073618088" watchObservedRunningTime="2026-03-18 14:04:02.62773738 +0000 UTC m=+3691.087164815" Mar 18 14:04:03 crc kubenswrapper[4912]: I0318 14:04:03.607605 4912 generic.go:334] "Generic (PLEG): container finished" podID="99debac2-992e-4f94-8f0f-d1c206348a7a" containerID="29c588cc977d164cd989e7cdd5b9f2693c4aa5347ffc6851045165e27c699583" exitCode=0 Mar 18 14:04:03 crc kubenswrapper[4912]: I0318 14:04:03.607688 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" event={"ID":"99debac2-992e-4f94-8f0f-d1c206348a7a","Type":"ContainerDied","Data":"29c588cc977d164cd989e7cdd5b9f2693c4aa5347ffc6851045165e27c699583"} Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.176312 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.310323 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwbz9\" (UniqueName: \"kubernetes.io/projected/99debac2-992e-4f94-8f0f-d1c206348a7a-kube-api-access-fwbz9\") pod \"99debac2-992e-4f94-8f0f-d1c206348a7a\" (UID: \"99debac2-992e-4f94-8f0f-d1c206348a7a\") " Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.327688 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99debac2-992e-4f94-8f0f-d1c206348a7a-kube-api-access-fwbz9" (OuterVolumeSpecName: "kube-api-access-fwbz9") pod "99debac2-992e-4f94-8f0f-d1c206348a7a" (UID: "99debac2-992e-4f94-8f0f-d1c206348a7a"). InnerVolumeSpecName "kube-api-access-fwbz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.366658 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-vgjgt"] Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.380157 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-vgjgt"] Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.416208 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwbz9\" (UniqueName: \"kubernetes.io/projected/99debac2-992e-4f94-8f0f-d1c206348a7a-kube-api-access-fwbz9\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.632892 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" event={"ID":"99debac2-992e-4f94-8f0f-d1c206348a7a","Type":"ContainerDied","Data":"bd1ab6f9a05de4d44709678325a871dd2b8aac64a9dab9b4441319483e9eece6"} Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.632940 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-2lg7g" Mar 18 14:04:05 crc kubenswrapper[4912]: I0318 14:04:05.632941 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd1ab6f9a05de4d44709678325a871dd2b8aac64a9dab9b4441319483e9eece6" Mar 18 14:04:06 crc kubenswrapper[4912]: I0318 14:04:06.248647 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba93aeb-6b96-4a8f-a5e7-8f426adb327c" path="/var/lib/kubelet/pods/9ba93aeb-6b96-4a8f-a5e7-8f426adb327c/volumes" Mar 18 14:04:06 crc kubenswrapper[4912]: I0318 14:04:06.998872 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:04:06 crc kubenswrapper[4912]: I0318 14:04:06.998968 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:04:37 crc kubenswrapper[4912]: I0318 14:04:36.999868 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:04:37 crc kubenswrapper[4912]: I0318 14:04:37.000651 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 18 14:04:37 crc kubenswrapper[4912]: I0318 14:04:37.000714 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 14:04:37 crc kubenswrapper[4912]: I0318 14:04:37.002262 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08e9b8ae719ab48acec11413976ee77a91412253b2f23af52395c546c6c4da43"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:04:37 crc kubenswrapper[4912]: I0318 14:04:37.002338 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://08e9b8ae719ab48acec11413976ee77a91412253b2f23af52395c546c6c4da43" gracePeriod=600 Mar 18 14:04:38 crc kubenswrapper[4912]: I0318 14:04:38.082320 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="08e9b8ae719ab48acec11413976ee77a91412253b2f23af52395c546c6c4da43" exitCode=0 Mar 18 14:04:38 crc kubenswrapper[4912]: I0318 14:04:38.082360 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"08e9b8ae719ab48acec11413976ee77a91412253b2f23af52395c546c6c4da43"} Mar 18 14:04:38 crc kubenswrapper[4912]: I0318 14:04:38.083124 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3"} Mar 18 14:04:38 crc kubenswrapper[4912]: I0318 14:04:38.083164 4912 scope.go:117] "RemoveContainer" containerID="85f6c8214226eb9f021134308031fcec6af7f00d3d0174ed50bf15cf495cd7dc" Mar 18 14:05:05 crc kubenswrapper[4912]: I0318 14:05:05.419888 4912 scope.go:117] "RemoveContainer" containerID="d5295277e69ed839713a9bef40b446aec53e72fefeea4825c893a54ea0cd42a2" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.152579 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564046-ghdb9"] Mar 18 14:06:00 crc kubenswrapper[4912]: E0318 14:06:00.153893 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99debac2-992e-4f94-8f0f-d1c206348a7a" containerName="oc" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.153912 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="99debac2-992e-4f94-8f0f-d1c206348a7a" containerName="oc" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.154319 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="99debac2-992e-4f94-8f0f-d1c206348a7a" containerName="oc" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.155486 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-ghdb9" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.164415 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-ghdb9"] Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.189358 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.189820 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.189924 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.238845 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnj2\" (UniqueName: \"kubernetes.io/projected/b8c1ce29-8a72-4ad2-b226-f39747eba1c2-kube-api-access-5jnj2\") pod \"auto-csr-approver-29564046-ghdb9\" (UID: \"b8c1ce29-8a72-4ad2-b226-f39747eba1c2\") " pod="openshift-infra/auto-csr-approver-29564046-ghdb9" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.341774 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnj2\" (UniqueName: \"kubernetes.io/projected/b8c1ce29-8a72-4ad2-b226-f39747eba1c2-kube-api-access-5jnj2\") pod \"auto-csr-approver-29564046-ghdb9\" (UID: \"b8c1ce29-8a72-4ad2-b226-f39747eba1c2\") " pod="openshift-infra/auto-csr-approver-29564046-ghdb9" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.369168 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnj2\" (UniqueName: \"kubernetes.io/projected/b8c1ce29-8a72-4ad2-b226-f39747eba1c2-kube-api-access-5jnj2\") pod \"auto-csr-approver-29564046-ghdb9\" (UID: \"b8c1ce29-8a72-4ad2-b226-f39747eba1c2\") " 
pod="openshift-infra/auto-csr-approver-29564046-ghdb9" Mar 18 14:06:00 crc kubenswrapper[4912]: I0318 14:06:00.509086 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-ghdb9" Mar 18 14:06:01 crc kubenswrapper[4912]: I0318 14:06:01.032729 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-ghdb9"] Mar 18 14:06:01 crc kubenswrapper[4912]: I0318 14:06:01.041510 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:06:01 crc kubenswrapper[4912]: I0318 14:06:01.166282 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-ghdb9" event={"ID":"b8c1ce29-8a72-4ad2-b226-f39747eba1c2","Type":"ContainerStarted","Data":"3ad958c795e7c7de9d62382eb7d58bce7ecdebf7a5e53595a52be80a15682a29"} Mar 18 14:06:03 crc kubenswrapper[4912]: I0318 14:06:03.191498 4912 generic.go:334] "Generic (PLEG): container finished" podID="b8c1ce29-8a72-4ad2-b226-f39747eba1c2" containerID="604d177f2038a60e87d61e60f1217bebbf0482d64195eda499bf097ca909395c" exitCode=0 Mar 18 14:06:03 crc kubenswrapper[4912]: I0318 14:06:03.191609 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-ghdb9" event={"ID":"b8c1ce29-8a72-4ad2-b226-f39747eba1c2","Type":"ContainerDied","Data":"604d177f2038a60e87d61e60f1217bebbf0482d64195eda499bf097ca909395c"} Mar 18 14:06:04 crc kubenswrapper[4912]: I0318 14:06:04.612641 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-ghdb9" Mar 18 14:06:04 crc kubenswrapper[4912]: I0318 14:06:04.708791 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jnj2\" (UniqueName: \"kubernetes.io/projected/b8c1ce29-8a72-4ad2-b226-f39747eba1c2-kube-api-access-5jnj2\") pod \"b8c1ce29-8a72-4ad2-b226-f39747eba1c2\" (UID: \"b8c1ce29-8a72-4ad2-b226-f39747eba1c2\") " Mar 18 14:06:04 crc kubenswrapper[4912]: I0318 14:06:04.718563 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c1ce29-8a72-4ad2-b226-f39747eba1c2-kube-api-access-5jnj2" (OuterVolumeSpecName: "kube-api-access-5jnj2") pod "b8c1ce29-8a72-4ad2-b226-f39747eba1c2" (UID: "b8c1ce29-8a72-4ad2-b226-f39747eba1c2"). InnerVolumeSpecName "kube-api-access-5jnj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:06:04 crc kubenswrapper[4912]: I0318 14:06:04.812898 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jnj2\" (UniqueName: \"kubernetes.io/projected/b8c1ce29-8a72-4ad2-b226-f39747eba1c2-kube-api-access-5jnj2\") on node \"crc\" DevicePath \"\"" Mar 18 14:06:05 crc kubenswrapper[4912]: I0318 14:06:05.233509 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-ghdb9" event={"ID":"b8c1ce29-8a72-4ad2-b226-f39747eba1c2","Type":"ContainerDied","Data":"3ad958c795e7c7de9d62382eb7d58bce7ecdebf7a5e53595a52be80a15682a29"} Mar 18 14:06:05 crc kubenswrapper[4912]: I0318 14:06:05.233587 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad958c795e7c7de9d62382eb7d58bce7ecdebf7a5e53595a52be80a15682a29" Mar 18 14:06:05 crc kubenswrapper[4912]: I0318 14:06:05.234116 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-ghdb9" Mar 18 14:06:05 crc kubenswrapper[4912]: I0318 14:06:05.704138 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-kllnj"] Mar 18 14:06:05 crc kubenswrapper[4912]: I0318 14:06:05.723478 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-kllnj"] Mar 18 14:06:06 crc kubenswrapper[4912]: I0318 14:06:06.247351 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645d1335-bcd3-4833-9d80-3d7b3403ce39" path="/var/lib/kubelet/pods/645d1335-bcd3-4833-9d80-3d7b3403ce39/volumes" Mar 18 14:07:05 crc kubenswrapper[4912]: I0318 14:07:05.548878 4912 scope.go:117] "RemoveContainer" containerID="5a16dbd89408fd77e1fc00542c324a222d7e32594c482115d04bd7867895a63d" Mar 18 14:07:06 crc kubenswrapper[4912]: I0318 14:07:06.998799 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:07:07 crc kubenswrapper[4912]: I0318 14:07:06.999178 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:07:08 crc kubenswrapper[4912]: I0318 14:07:08.733730 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:07:37 crc kubenswrapper[4912]: I0318 14:07:36.999460 4912 patch_prober.go:28] interesting 
pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:07:37 crc kubenswrapper[4912]: I0318 14:07:37.000266 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:07:46 crc kubenswrapper[4912]: I0318 14:07:46.993288 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wnpsm"] Mar 18 14:07:46 crc kubenswrapper[4912]: E0318 14:07:46.994914 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c1ce29-8a72-4ad2-b226-f39747eba1c2" containerName="oc" Mar 18 14:07:46 crc kubenswrapper[4912]: I0318 14:07:46.994935 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c1ce29-8a72-4ad2-b226-f39747eba1c2" containerName="oc" Mar 18 14:07:46 crc kubenswrapper[4912]: I0318 14:07:46.995351 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c1ce29-8a72-4ad2-b226-f39747eba1c2" containerName="oc" Mar 18 14:07:46 crc kubenswrapper[4912]: I0318 14:07:46.997880 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.013286 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnpsm"] Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.113817 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-utilities\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.113894 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8qv9\" (UniqueName: \"kubernetes.io/projected/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-kube-api-access-s8qv9\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.113966 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-catalog-content\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.217873 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-catalog-content\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.218265 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-utilities\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.218319 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qv9\" (UniqueName: \"kubernetes.io/projected/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-kube-api-access-s8qv9\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.218618 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-catalog-content\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.219067 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-utilities\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.250194 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qv9\" (UniqueName: \"kubernetes.io/projected/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-kube-api-access-s8qv9\") pod \"redhat-marketplace-wnpsm\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.339733 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:47 crc kubenswrapper[4912]: I0318 14:07:47.976944 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnpsm"] Mar 18 14:07:48 crc kubenswrapper[4912]: I0318 14:07:48.684518 4912 generic.go:334] "Generic (PLEG): container finished" podID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerID="a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a" exitCode=0 Mar 18 14:07:48 crc kubenswrapper[4912]: I0318 14:07:48.684586 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnpsm" event={"ID":"f051a6f2-0c54-4f4d-a583-3bd5da3085d5","Type":"ContainerDied","Data":"a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a"} Mar 18 14:07:48 crc kubenswrapper[4912]: I0318 14:07:48.685363 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnpsm" event={"ID":"f051a6f2-0c54-4f4d-a583-3bd5da3085d5","Type":"ContainerStarted","Data":"8d165affffd590aacf40ff60429758a8fcaa2908d72d911df35275b2c6a9238c"} Mar 18 14:07:49 crc kubenswrapper[4912]: I0318 14:07:49.706388 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnpsm" event={"ID":"f051a6f2-0c54-4f4d-a583-3bd5da3085d5","Type":"ContainerStarted","Data":"8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3"} Mar 18 14:07:51 crc kubenswrapper[4912]: I0318 14:07:51.731446 4912 generic.go:334] "Generic (PLEG): container finished" podID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerID="8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3" exitCode=0 Mar 18 14:07:51 crc kubenswrapper[4912]: I0318 14:07:51.731643 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnpsm" 
event={"ID":"f051a6f2-0c54-4f4d-a583-3bd5da3085d5","Type":"ContainerDied","Data":"8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3"} Mar 18 14:07:52 crc kubenswrapper[4912]: I0318 14:07:52.768800 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnpsm" event={"ID":"f051a6f2-0c54-4f4d-a583-3bd5da3085d5","Type":"ContainerStarted","Data":"ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4"} Mar 18 14:07:52 crc kubenswrapper[4912]: I0318 14:07:52.802764 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wnpsm" podStartSLOduration=3.187437285 podStartE2EDuration="6.802732369s" podCreationTimestamp="2026-03-18 14:07:46 +0000 UTC" firstStartedPulling="2026-03-18 14:07:48.689926443 +0000 UTC m=+3917.149353868" lastFinishedPulling="2026-03-18 14:07:52.305221527 +0000 UTC m=+3920.764648952" observedRunningTime="2026-03-18 14:07:52.789411298 +0000 UTC m=+3921.248838743" watchObservedRunningTime="2026-03-18 14:07:52.802732369 +0000 UTC m=+3921.262159794" Mar 18 14:07:57 crc kubenswrapper[4912]: I0318 14:07:57.340531 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:57 crc kubenswrapper[4912]: I0318 14:07:57.341193 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:57 crc kubenswrapper[4912]: I0318 14:07:57.413226 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:57 crc kubenswrapper[4912]: I0318 14:07:57.885160 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:07:57 crc kubenswrapper[4912]: I0318 14:07:57.957587 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wnpsm"] Mar 18 14:07:59 crc kubenswrapper[4912]: I0318 14:07:59.856273 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wnpsm" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="registry-server" containerID="cri-o://ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4" gracePeriod=2 Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.181676 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564048-cf8gc"] Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.183969 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-cf8gc" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.187674 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.187713 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.187767 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.195281 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-cf8gc"] Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.261491 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4czj7\" (UniqueName: \"kubernetes.io/projected/4f9853e4-ff99-44df-89f4-72113fe0c319-kube-api-access-4czj7\") pod \"auto-csr-approver-29564048-cf8gc\" (UID: \"4f9853e4-ff99-44df-89f4-72113fe0c319\") " pod="openshift-infra/auto-csr-approver-29564048-cf8gc" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.370312 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4czj7\" (UniqueName: \"kubernetes.io/projected/4f9853e4-ff99-44df-89f4-72113fe0c319-kube-api-access-4czj7\") pod \"auto-csr-approver-29564048-cf8gc\" (UID: \"4f9853e4-ff99-44df-89f4-72113fe0c319\") " pod="openshift-infra/auto-csr-approver-29564048-cf8gc" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.397534 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4czj7\" (UniqueName: \"kubernetes.io/projected/4f9853e4-ff99-44df-89f4-72113fe0c319-kube-api-access-4czj7\") pod \"auto-csr-approver-29564048-cf8gc\" (UID: \"4f9853e4-ff99-44df-89f4-72113fe0c319\") " pod="openshift-infra/auto-csr-approver-29564048-cf8gc" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.518696 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-cf8gc" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.727207 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.869302 4912 generic.go:334] "Generic (PLEG): container finished" podID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerID="ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4" exitCode=0 Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.869382 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnpsm" event={"ID":"f051a6f2-0c54-4f4d-a583-3bd5da3085d5","Type":"ContainerDied","Data":"ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4"} Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.869736 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnpsm" event={"ID":"f051a6f2-0c54-4f4d-a583-3bd5da3085d5","Type":"ContainerDied","Data":"8d165affffd590aacf40ff60429758a8fcaa2908d72d911df35275b2c6a9238c"} Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.869770 4912 scope.go:117] "RemoveContainer" containerID="ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.869464 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnpsm" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.903661 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-catalog-content\") pod \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.903782 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-utilities\") pod \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.903907 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8qv9\" (UniqueName: \"kubernetes.io/projected/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-kube-api-access-s8qv9\") pod \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\" (UID: \"f051a6f2-0c54-4f4d-a583-3bd5da3085d5\") " Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.905176 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-utilities" (OuterVolumeSpecName: "utilities") pod "f051a6f2-0c54-4f4d-a583-3bd5da3085d5" (UID: "f051a6f2-0c54-4f4d-a583-3bd5da3085d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.908203 4912 scope.go:117] "RemoveContainer" containerID="8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.932740 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f051a6f2-0c54-4f4d-a583-3bd5da3085d5" (UID: "f051a6f2-0c54-4f4d-a583-3bd5da3085d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.933532 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-kube-api-access-s8qv9" (OuterVolumeSpecName: "kube-api-access-s8qv9") pod "f051a6f2-0c54-4f4d-a583-3bd5da3085d5" (UID: "f051a6f2-0c54-4f4d-a583-3bd5da3085d5"). InnerVolumeSpecName "kube-api-access-s8qv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:08:00 crc kubenswrapper[4912]: I0318 14:08:00.993005 4912 scope.go:117] "RemoveContainer" containerID="a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.007433 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.007481 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.007495 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8qv9\" (UniqueName: \"kubernetes.io/projected/f051a6f2-0c54-4f4d-a583-3bd5da3085d5-kube-api-access-s8qv9\") on node \"crc\" DevicePath \"\"" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.053285 4912 scope.go:117] "RemoveContainer" containerID="ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4" Mar 18 14:08:01 crc kubenswrapper[4912]: E0318 14:08:01.053784 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4\": container with ID starting with ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4 not found: ID does not exist" containerID="ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.053833 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4"} err="failed to get container status 
\"ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4\": rpc error: code = NotFound desc = could not find container \"ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4\": container with ID starting with ecacd0272fcadd0b831f5d2ff5cc55949b503c2d95887ab12ace970bc7327dc4 not found: ID does not exist" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.053867 4912 scope.go:117] "RemoveContainer" containerID="8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3" Mar 18 14:08:01 crc kubenswrapper[4912]: E0318 14:08:01.054229 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3\": container with ID starting with 8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3 not found: ID does not exist" containerID="8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.054256 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3"} err="failed to get container status \"8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3\": rpc error: code = NotFound desc = could not find container \"8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3\": container with ID starting with 8580921806ba7a9e8ace7050fa66854c79decb07fc2b2fcc3a2ae73a9f24d8c3 not found: ID does not exist" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.054271 4912 scope.go:117] "RemoveContainer" containerID="a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a" Mar 18 14:08:01 crc kubenswrapper[4912]: E0318 14:08:01.054554 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a\": container with ID starting with a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a not found: ID does not exist" containerID="a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.054599 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a"} err="failed to get container status \"a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a\": rpc error: code = NotFound desc = could not find container \"a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a\": container with ID starting with a26b730618404a232592d4a7e984e4358756eb5aece8d9da46f41651e3c79c1a not found: ID does not exist" Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.084734 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-cf8gc"] Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.217809 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnpsm"] Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.230418 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnpsm"] Mar 18 14:08:01 crc kubenswrapper[4912]: I0318 14:08:01.883657 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-cf8gc" event={"ID":"4f9853e4-ff99-44df-89f4-72113fe0c319","Type":"ContainerStarted","Data":"d1a3c904a751e6be95297a8f149208987641e118a713c488f2977f95c783d65f"} Mar 18 14:08:02 crc kubenswrapper[4912]: I0318 14:08:02.244361 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" path="/var/lib/kubelet/pods/f051a6f2-0c54-4f4d-a583-3bd5da3085d5/volumes" Mar 18 14:08:02 crc 
kubenswrapper[4912]: I0318 14:08:02.900049 4912 generic.go:334] "Generic (PLEG): container finished" podID="4f9853e4-ff99-44df-89f4-72113fe0c319" containerID="f40225a9a8f4201285c97a2a2cdfef917bfb3258ed2723d3811b30e3bb0b6e19" exitCode=0 Mar 18 14:08:02 crc kubenswrapper[4912]: I0318 14:08:02.900817 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-cf8gc" event={"ID":"4f9853e4-ff99-44df-89f4-72113fe0c319","Type":"ContainerDied","Data":"f40225a9a8f4201285c97a2a2cdfef917bfb3258ed2723d3811b30e3bb0b6e19"} Mar 18 14:08:04 crc kubenswrapper[4912]: I0318 14:08:04.459092 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-cf8gc" Mar 18 14:08:04 crc kubenswrapper[4912]: I0318 14:08:04.611677 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4czj7\" (UniqueName: \"kubernetes.io/projected/4f9853e4-ff99-44df-89f4-72113fe0c319-kube-api-access-4czj7\") pod \"4f9853e4-ff99-44df-89f4-72113fe0c319\" (UID: \"4f9853e4-ff99-44df-89f4-72113fe0c319\") " Mar 18 14:08:04 crc kubenswrapper[4912]: I0318 14:08:04.619635 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9853e4-ff99-44df-89f4-72113fe0c319-kube-api-access-4czj7" (OuterVolumeSpecName: "kube-api-access-4czj7") pod "4f9853e4-ff99-44df-89f4-72113fe0c319" (UID: "4f9853e4-ff99-44df-89f4-72113fe0c319"). InnerVolumeSpecName "kube-api-access-4czj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:08:04 crc kubenswrapper[4912]: I0318 14:08:04.715503 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4czj7\" (UniqueName: \"kubernetes.io/projected/4f9853e4-ff99-44df-89f4-72113fe0c319-kube-api-access-4czj7\") on node \"crc\" DevicePath \"\"" Mar 18 14:08:04 crc kubenswrapper[4912]: I0318 14:08:04.929749 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-cf8gc" event={"ID":"4f9853e4-ff99-44df-89f4-72113fe0c319","Type":"ContainerDied","Data":"d1a3c904a751e6be95297a8f149208987641e118a713c488f2977f95c783d65f"} Mar 18 14:08:04 crc kubenswrapper[4912]: I0318 14:08:04.929799 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a3c904a751e6be95297a8f149208987641e118a713c488f2977f95c783d65f" Mar 18 14:08:04 crc kubenswrapper[4912]: I0318 14:08:04.930113 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-cf8gc" Mar 18 14:08:05 crc kubenswrapper[4912]: I0318 14:08:05.553065 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-9c8xl"] Mar 18 14:08:05 crc kubenswrapper[4912]: I0318 14:08:05.567161 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-9c8xl"] Mar 18 14:08:06 crc kubenswrapper[4912]: I0318 14:08:06.242680 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36bb26ac-5ec1-404c-8faf-3c5d40bb699d" path="/var/lib/kubelet/pods/36bb26ac-5ec1-404c-8faf-3c5d40bb699d/volumes" Mar 18 14:08:06 crc kubenswrapper[4912]: I0318 14:08:06.999214 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 14:08:06 crc kubenswrapper[4912]: I0318 14:08:06.999307 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:08:06 crc kubenswrapper[4912]: I0318 14:08:06.999379 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 14:08:07 crc kubenswrapper[4912]: I0318 14:08:07.001093 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:08:07 crc kubenswrapper[4912]: I0318 14:08:07.001170 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" gracePeriod=600 Mar 18 14:08:07 crc kubenswrapper[4912]: E0318 14:08:07.125804 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:08:07 crc kubenswrapper[4912]: 
I0318 14:08:07.979951 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" exitCode=0 Mar 18 14:08:07 crc kubenswrapper[4912]: I0318 14:08:07.980164 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3"} Mar 18 14:08:07 crc kubenswrapper[4912]: I0318 14:08:07.980436 4912 scope.go:117] "RemoveContainer" containerID="08e9b8ae719ab48acec11413976ee77a91412253b2f23af52395c546c6c4da43" Mar 18 14:08:07 crc kubenswrapper[4912]: I0318 14:08:07.981906 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:08:07 crc kubenswrapper[4912]: E0318 14:08:07.982760 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:08:20 crc kubenswrapper[4912]: I0318 14:08:20.228779 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:08:20 crc kubenswrapper[4912]: E0318 14:08:20.229781 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:08:34 crc kubenswrapper[4912]: I0318 14:08:34.228675 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:08:34 crc kubenswrapper[4912]: E0318 14:08:34.229446 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:08:46 crc kubenswrapper[4912]: I0318 14:08:46.228964 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:08:46 crc kubenswrapper[4912]: E0318 14:08:46.230131 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:08:58 crc kubenswrapper[4912]: I0318 14:08:58.228783 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:08:58 crc kubenswrapper[4912]: E0318 14:08:58.229953 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:09:05 crc kubenswrapper[4912]: I0318 14:09:05.690500 4912 scope.go:117] "RemoveContainer" containerID="e261d92a7ff0e096355713ef04c6f12ec770dda9a5317cd2eb9bd3fa98915fc6" Mar 18 14:09:12 crc kubenswrapper[4912]: I0318 14:09:12.236620 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:09:12 crc kubenswrapper[4912]: E0318 14:09:12.237680 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:09:23 crc kubenswrapper[4912]: I0318 14:09:23.229193 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:09:23 crc kubenswrapper[4912]: E0318 14:09:23.231659 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:09:37 crc kubenswrapper[4912]: I0318 14:09:37.228092 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:09:37 crc kubenswrapper[4912]: E0318 14:09:37.229156 4912 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:09:49 crc kubenswrapper[4912]: I0318 14:09:49.228880 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:09:49 crc kubenswrapper[4912]: E0318 14:09:49.230200 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.153452 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564050-9srqk"] Mar 18 14:10:00 crc kubenswrapper[4912]: E0318 14:10:00.154899 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="extract-content" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.154914 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="extract-content" Mar 18 14:10:00 crc kubenswrapper[4912]: E0318 14:10:00.154929 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="registry-server" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.154935 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="registry-server" Mar 18 14:10:00 crc kubenswrapper[4912]: E0318 14:10:00.154984 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="extract-utilities" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.154992 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="extract-utilities" Mar 18 14:10:00 crc kubenswrapper[4912]: E0318 14:10:00.155030 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9853e4-ff99-44df-89f4-72113fe0c319" containerName="oc" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.155057 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9853e4-ff99-44df-89f4-72113fe0c319" containerName="oc" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.155301 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f051a6f2-0c54-4f4d-a583-3bd5da3085d5" containerName="registry-server" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.155312 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9853e4-ff99-44df-89f4-72113fe0c319" containerName="oc" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.156242 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-9srqk" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.159567 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.159705 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.159738 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.261318 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-9srqk"] Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.266975 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7jb\" (UniqueName: \"kubernetes.io/projected/68102db5-4e1b-4ca8-8fa5-00b054ceebd6-kube-api-access-5z7jb\") pod \"auto-csr-approver-29564050-9srqk\" (UID: \"68102db5-4e1b-4ca8-8fa5-00b054ceebd6\") " pod="openshift-infra/auto-csr-approver-29564050-9srqk" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.370307 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7jb\" (UniqueName: \"kubernetes.io/projected/68102db5-4e1b-4ca8-8fa5-00b054ceebd6-kube-api-access-5z7jb\") pod \"auto-csr-approver-29564050-9srqk\" (UID: \"68102db5-4e1b-4ca8-8fa5-00b054ceebd6\") " pod="openshift-infra/auto-csr-approver-29564050-9srqk" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.392885 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7jb\" (UniqueName: \"kubernetes.io/projected/68102db5-4e1b-4ca8-8fa5-00b054ceebd6-kube-api-access-5z7jb\") pod \"auto-csr-approver-29564050-9srqk\" (UID: \"68102db5-4e1b-4ca8-8fa5-00b054ceebd6\") " 
pod="openshift-infra/auto-csr-approver-29564050-9srqk" Mar 18 14:10:00 crc kubenswrapper[4912]: I0318 14:10:00.479325 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-9srqk" Mar 18 14:10:01 crc kubenswrapper[4912]: I0318 14:10:01.043956 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-9srqk"] Mar 18 14:10:01 crc kubenswrapper[4912]: I0318 14:10:01.235520 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:10:01 crc kubenswrapper[4912]: E0318 14:10:01.236298 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:10:01 crc kubenswrapper[4912]: I0318 14:10:01.492237 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-9srqk" event={"ID":"68102db5-4e1b-4ca8-8fa5-00b054ceebd6","Type":"ContainerStarted","Data":"97c468e9586c7d36f9ac5aacbecd4e0ef0f541cd1f6f3eb7d6190321e8f55afb"} Mar 18 14:10:02 crc kubenswrapper[4912]: I0318 14:10:02.505099 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-9srqk" event={"ID":"68102db5-4e1b-4ca8-8fa5-00b054ceebd6","Type":"ContainerStarted","Data":"d8eb327b797ab247c84e3e87bd5a6e411dcf1449d397926adfc029d2f3ebaba2"} Mar 18 14:10:02 crc kubenswrapper[4912]: I0318 14:10:02.542981 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564050-9srqk" podStartSLOduration=1.5315904150000001 
podStartE2EDuration="2.542958158s" podCreationTimestamp="2026-03-18 14:10:00 +0000 UTC" firstStartedPulling="2026-03-18 14:10:01.050532357 +0000 UTC m=+4049.509959772" lastFinishedPulling="2026-03-18 14:10:02.06190009 +0000 UTC m=+4050.521327515" observedRunningTime="2026-03-18 14:10:02.527688234 +0000 UTC m=+4050.987115659" watchObservedRunningTime="2026-03-18 14:10:02.542958158 +0000 UTC m=+4051.002385583" Mar 18 14:10:03 crc kubenswrapper[4912]: I0318 14:10:03.517600 4912 generic.go:334] "Generic (PLEG): container finished" podID="68102db5-4e1b-4ca8-8fa5-00b054ceebd6" containerID="d8eb327b797ab247c84e3e87bd5a6e411dcf1449d397926adfc029d2f3ebaba2" exitCode=0 Mar 18 14:10:03 crc kubenswrapper[4912]: I0318 14:10:03.517748 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-9srqk" event={"ID":"68102db5-4e1b-4ca8-8fa5-00b054ceebd6","Type":"ContainerDied","Data":"d8eb327b797ab247c84e3e87bd5a6e411dcf1449d397926adfc029d2f3ebaba2"} Mar 18 14:10:05 crc kubenswrapper[4912]: I0318 14:10:05.553128 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-9srqk" event={"ID":"68102db5-4e1b-4ca8-8fa5-00b054ceebd6","Type":"ContainerDied","Data":"97c468e9586c7d36f9ac5aacbecd4e0ef0f541cd1f6f3eb7d6190321e8f55afb"} Mar 18 14:10:05 crc kubenswrapper[4912]: I0318 14:10:05.553631 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c468e9586c7d36f9ac5aacbecd4e0ef0f541cd1f6f3eb7d6190321e8f55afb" Mar 18 14:10:05 crc kubenswrapper[4912]: I0318 14:10:05.683999 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-9srqk" Mar 18 14:10:05 crc kubenswrapper[4912]: I0318 14:10:05.863980 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z7jb\" (UniqueName: \"kubernetes.io/projected/68102db5-4e1b-4ca8-8fa5-00b054ceebd6-kube-api-access-5z7jb\") pod \"68102db5-4e1b-4ca8-8fa5-00b054ceebd6\" (UID: \"68102db5-4e1b-4ca8-8fa5-00b054ceebd6\") " Mar 18 14:10:05 crc kubenswrapper[4912]: I0318 14:10:05.875353 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68102db5-4e1b-4ca8-8fa5-00b054ceebd6-kube-api-access-5z7jb" (OuterVolumeSpecName: "kube-api-access-5z7jb") pod "68102db5-4e1b-4ca8-8fa5-00b054ceebd6" (UID: "68102db5-4e1b-4ca8-8fa5-00b054ceebd6"). InnerVolumeSpecName "kube-api-access-5z7jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:10:05 crc kubenswrapper[4912]: I0318 14:10:05.967924 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z7jb\" (UniqueName: \"kubernetes.io/projected/68102db5-4e1b-4ca8-8fa5-00b054ceebd6-kube-api-access-5z7jb\") on node \"crc\" DevicePath \"\"" Mar 18 14:10:06 crc kubenswrapper[4912]: I0318 14:10:06.565712 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-9srqk" Mar 18 14:10:06 crc kubenswrapper[4912]: I0318 14:10:06.786429 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-2lg7g"] Mar 18 14:10:06 crc kubenswrapper[4912]: I0318 14:10:06.799752 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-2lg7g"] Mar 18 14:10:08 crc kubenswrapper[4912]: I0318 14:10:08.243810 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99debac2-992e-4f94-8f0f-d1c206348a7a" path="/var/lib/kubelet/pods/99debac2-992e-4f94-8f0f-d1c206348a7a/volumes" Mar 18 14:10:13 crc kubenswrapper[4912]: I0318 14:10:13.228552 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:10:13 crc kubenswrapper[4912]: E0318 14:10:13.230159 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:10:25 crc kubenswrapper[4912]: I0318 14:10:25.227994 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:10:25 crc kubenswrapper[4912]: E0318 14:10:25.229270 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:10:38 crc kubenswrapper[4912]: I0318 14:10:38.229214 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:10:38 crc kubenswrapper[4912]: E0318 14:10:38.230562 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:10:49 crc kubenswrapper[4912]: I0318 14:10:49.228618 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:10:49 crc kubenswrapper[4912]: E0318 14:10:49.229853 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.832149 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rkdkv"] Mar 18 14:10:54 crc kubenswrapper[4912]: E0318 14:10:54.834415 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68102db5-4e1b-4ca8-8fa5-00b054ceebd6" containerName="oc" Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.834438 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="68102db5-4e1b-4ca8-8fa5-00b054ceebd6" containerName="oc" Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.835170 4912 
memory_manager.go:354] "RemoveStaleState removing state" podUID="68102db5-4e1b-4ca8-8fa5-00b054ceebd6" containerName="oc" Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.857857 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.871875 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rkdkv"] Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.900371 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-catalog-content\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.901191 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hrq\" (UniqueName: \"kubernetes.io/projected/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-kube-api-access-n5hrq\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:54 crc kubenswrapper[4912]: I0318 14:10:54.901334 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-utilities\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.003922 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hrq\" (UniqueName: 
\"kubernetes.io/projected/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-kube-api-access-n5hrq\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.004031 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-utilities\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.004117 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-catalog-content\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.004868 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-catalog-content\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.005710 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-utilities\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.029183 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hrq\" (UniqueName: 
\"kubernetes.io/projected/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-kube-api-access-n5hrq\") pod \"community-operators-rkdkv\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.195173 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:10:55 crc kubenswrapper[4912]: I0318 14:10:55.793002 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rkdkv"] Mar 18 14:10:56 crc kubenswrapper[4912]: I0318 14:10:56.200238 4912 generic.go:334] "Generic (PLEG): container finished" podID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerID="cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d" exitCode=0 Mar 18 14:10:56 crc kubenswrapper[4912]: I0318 14:10:56.200766 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkdkv" event={"ID":"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6","Type":"ContainerDied","Data":"cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d"} Mar 18 14:10:56 crc kubenswrapper[4912]: I0318 14:10:56.200799 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkdkv" event={"ID":"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6","Type":"ContainerStarted","Data":"aa4d25e2b10d969e58f16c4b366914cb6d8bf52c58f3d7508c55273c46898199"} Mar 18 14:10:58 crc kubenswrapper[4912]: I0318 14:10:58.261519 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkdkv" event={"ID":"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6","Type":"ContainerStarted","Data":"6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da"} Mar 18 14:10:59 crc kubenswrapper[4912]: I0318 14:10:59.242727 4912 generic.go:334] "Generic (PLEG): container finished" podID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" 
containerID="6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da" exitCode=0 Mar 18 14:10:59 crc kubenswrapper[4912]: I0318 14:10:59.242814 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkdkv" event={"ID":"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6","Type":"ContainerDied","Data":"6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da"} Mar 18 14:11:00 crc kubenswrapper[4912]: I0318 14:11:00.264369 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkdkv" event={"ID":"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6","Type":"ContainerStarted","Data":"698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f"} Mar 18 14:11:00 crc kubenswrapper[4912]: I0318 14:11:00.298641 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rkdkv" podStartSLOduration=2.621681434 podStartE2EDuration="6.298598867s" podCreationTimestamp="2026-03-18 14:10:54 +0000 UTC" firstStartedPulling="2026-03-18 14:10:56.202377281 +0000 UTC m=+4104.661804706" lastFinishedPulling="2026-03-18 14:10:59.879294714 +0000 UTC m=+4108.338722139" observedRunningTime="2026-03-18 14:11:00.294004202 +0000 UTC m=+4108.753431627" watchObservedRunningTime="2026-03-18 14:11:00.298598867 +0000 UTC m=+4108.758026302" Mar 18 14:11:01 crc kubenswrapper[4912]: I0318 14:11:01.229617 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:11:01 crc kubenswrapper[4912]: E0318 14:11:01.231145 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:11:05 crc kubenswrapper[4912]: I0318 14:11:05.196565 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:11:05 crc kubenswrapper[4912]: I0318 14:11:05.229246 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:11:05 crc kubenswrapper[4912]: I0318 14:11:05.285112 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:11:05 crc kubenswrapper[4912]: I0318 14:11:05.395261 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:11:05 crc kubenswrapper[4912]: I0318 14:11:05.531520 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rkdkv"] Mar 18 14:11:05 crc kubenswrapper[4912]: I0318 14:11:05.802514 4912 scope.go:117] "RemoveContainer" containerID="29c588cc977d164cd989e7cdd5b9f2693c4aa5347ffc6851045165e27c699583" Mar 18 14:11:07 crc kubenswrapper[4912]: I0318 14:11:07.361821 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rkdkv" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="registry-server" containerID="cri-o://698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f" gracePeriod=2 Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.054831 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.245332 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-utilities\") pod \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.245565 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-catalog-content\") pod \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.245823 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5hrq\" (UniqueName: \"kubernetes.io/projected/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-kube-api-access-n5hrq\") pod \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\" (UID: \"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6\") " Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.246690 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-utilities" (OuterVolumeSpecName: "utilities") pod "8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" (UID: "8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.256717 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-kube-api-access-n5hrq" (OuterVolumeSpecName: "kube-api-access-n5hrq") pod "8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" (UID: "8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6"). InnerVolumeSpecName "kube-api-access-n5hrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.303212 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" (UID: "8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.349904 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.349960 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.349975 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5hrq\" (UniqueName: \"kubernetes.io/projected/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6-kube-api-access-n5hrq\") on node \"crc\" DevicePath \"\"" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.380418 4912 generic.go:334] "Generic (PLEG): container finished" podID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerID="698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f" exitCode=0 Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.380485 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkdkv" event={"ID":"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6","Type":"ContainerDied","Data":"698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f"} Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.380496 4912 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-rkdkv" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.380527 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkdkv" event={"ID":"8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6","Type":"ContainerDied","Data":"aa4d25e2b10d969e58f16c4b366914cb6d8bf52c58f3d7508c55273c46898199"} Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.380549 4912 scope.go:117] "RemoveContainer" containerID="698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.435055 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rkdkv"] Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.445679 4912 scope.go:117] "RemoveContainer" containerID="6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.455637 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rkdkv"] Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.479605 4912 scope.go:117] "RemoveContainer" containerID="cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.559332 4912 scope.go:117] "RemoveContainer" containerID="698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f" Mar 18 14:11:08 crc kubenswrapper[4912]: E0318 14:11:08.560313 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f\": container with ID starting with 698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f not found: ID does not exist" containerID="698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.560362 
4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f"} err="failed to get container status \"698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f\": rpc error: code = NotFound desc = could not find container \"698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f\": container with ID starting with 698864deb55c11789930d2d8d81bc1b8a24d2350626e1e46c3bbc5667543c40f not found: ID does not exist" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.560408 4912 scope.go:117] "RemoveContainer" containerID="6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da" Mar 18 14:11:08 crc kubenswrapper[4912]: E0318 14:11:08.560952 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da\": container with ID starting with 6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da not found: ID does not exist" containerID="6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.561009 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da"} err="failed to get container status \"6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da\": rpc error: code = NotFound desc = could not find container \"6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da\": container with ID starting with 6a2cbc5da72000d44f111bb4aa896a54c1951c2fa3d49d97127d68086f3046da not found: ID does not exist" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.561058 4912 scope.go:117] "RemoveContainer" containerID="cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d" Mar 18 14:11:08 crc kubenswrapper[4912]: E0318 
14:11:08.561417 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d\": container with ID starting with cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d not found: ID does not exist" containerID="cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d" Mar 18 14:11:08 crc kubenswrapper[4912]: I0318 14:11:08.561449 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d"} err="failed to get container status \"cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d\": rpc error: code = NotFound desc = could not find container \"cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d\": container with ID starting with cc53e41370a3afb77de67e95f08f09f99fb8aa5cb2c99512be7675e2ac93af0d not found: ID does not exist" Mar 18 14:11:10 crc kubenswrapper[4912]: I0318 14:11:10.265622 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" path="/var/lib/kubelet/pods/8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6/volumes" Mar 18 14:11:16 crc kubenswrapper[4912]: I0318 14:11:16.229086 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:11:16 crc kubenswrapper[4912]: E0318 14:11:16.230289 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:11:27 crc kubenswrapper[4912]: I0318 14:11:27.228951 
4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:11:27 crc kubenswrapper[4912]: E0318 14:11:27.230456 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:11:41 crc kubenswrapper[4912]: I0318 14:11:41.229541 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:11:41 crc kubenswrapper[4912]: E0318 14:11:41.230577 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:11:56 crc kubenswrapper[4912]: I0318 14:11:56.228676 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:11:56 crc kubenswrapper[4912]: E0318 14:11:56.230077 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 
14:12:00.167678 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564052-8ljkx"] Mar 18 14:12:00 crc kubenswrapper[4912]: E0318 14:12:00.170529 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="extract-content" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.170641 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="extract-content" Mar 18 14:12:00 crc kubenswrapper[4912]: E0318 14:12:00.170766 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="registry-server" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.170840 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="registry-server" Mar 18 14:12:00 crc kubenswrapper[4912]: E0318 14:12:00.170917 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="extract-utilities" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.170972 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="extract-utilities" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.171391 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5b4f92-f6e5-4f0e-8450-8bfdbdce18c6" containerName="registry-server" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.172673 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-8ljkx" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.175396 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.175895 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.178316 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.183419 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-8ljkx"] Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.248459 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjw5z\" (UniqueName: \"kubernetes.io/projected/15617ee9-18d6-42e1-b207-cbbfc9f938ca-kube-api-access-kjw5z\") pod \"auto-csr-approver-29564052-8ljkx\" (UID: \"15617ee9-18d6-42e1-b207-cbbfc9f938ca\") " pod="openshift-infra/auto-csr-approver-29564052-8ljkx" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.352230 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjw5z\" (UniqueName: \"kubernetes.io/projected/15617ee9-18d6-42e1-b207-cbbfc9f938ca-kube-api-access-kjw5z\") pod \"auto-csr-approver-29564052-8ljkx\" (UID: \"15617ee9-18d6-42e1-b207-cbbfc9f938ca\") " pod="openshift-infra/auto-csr-approver-29564052-8ljkx" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.377105 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjw5z\" (UniqueName: \"kubernetes.io/projected/15617ee9-18d6-42e1-b207-cbbfc9f938ca-kube-api-access-kjw5z\") pod \"auto-csr-approver-29564052-8ljkx\" (UID: \"15617ee9-18d6-42e1-b207-cbbfc9f938ca\") " 
pod="openshift-infra/auto-csr-approver-29564052-8ljkx" Mar 18 14:12:00 crc kubenswrapper[4912]: I0318 14:12:00.504479 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-8ljkx" Mar 18 14:12:01 crc kubenswrapper[4912]: I0318 14:12:01.049809 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-8ljkx"] Mar 18 14:12:01 crc kubenswrapper[4912]: I0318 14:12:01.058022 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:12:01 crc kubenswrapper[4912]: I0318 14:12:01.127465 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-8ljkx" event={"ID":"15617ee9-18d6-42e1-b207-cbbfc9f938ca","Type":"ContainerStarted","Data":"70ccc4a779599e0f6d609dc48f6af46e8e652a37e74bcc12b6f568c13c95c6f4"} Mar 18 14:12:03 crc kubenswrapper[4912]: I0318 14:12:03.162618 4912 generic.go:334] "Generic (PLEG): container finished" podID="15617ee9-18d6-42e1-b207-cbbfc9f938ca" containerID="0005edb35531e76c6fac29a3b3a653614f1cd9151f38c59af8346c51f5b72747" exitCode=0 Mar 18 14:12:03 crc kubenswrapper[4912]: I0318 14:12:03.162702 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-8ljkx" event={"ID":"15617ee9-18d6-42e1-b207-cbbfc9f938ca","Type":"ContainerDied","Data":"0005edb35531e76c6fac29a3b3a653614f1cd9151f38c59af8346c51f5b72747"} Mar 18 14:12:04 crc kubenswrapper[4912]: I0318 14:12:04.691723 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-8ljkx" Mar 18 14:12:04 crc kubenswrapper[4912]: I0318 14:12:04.720493 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjw5z\" (UniqueName: \"kubernetes.io/projected/15617ee9-18d6-42e1-b207-cbbfc9f938ca-kube-api-access-kjw5z\") pod \"15617ee9-18d6-42e1-b207-cbbfc9f938ca\" (UID: \"15617ee9-18d6-42e1-b207-cbbfc9f938ca\") " Mar 18 14:12:04 crc kubenswrapper[4912]: I0318 14:12:04.744428 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15617ee9-18d6-42e1-b207-cbbfc9f938ca-kube-api-access-kjw5z" (OuterVolumeSpecName: "kube-api-access-kjw5z") pod "15617ee9-18d6-42e1-b207-cbbfc9f938ca" (UID: "15617ee9-18d6-42e1-b207-cbbfc9f938ca"). InnerVolumeSpecName "kube-api-access-kjw5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:12:04 crc kubenswrapper[4912]: I0318 14:12:04.824715 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjw5z\" (UniqueName: \"kubernetes.io/projected/15617ee9-18d6-42e1-b207-cbbfc9f938ca-kube-api-access-kjw5z\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:05 crc kubenswrapper[4912]: I0318 14:12:05.196489 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-8ljkx" Mar 18 14:12:05 crc kubenswrapper[4912]: I0318 14:12:05.196437 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-8ljkx" event={"ID":"15617ee9-18d6-42e1-b207-cbbfc9f938ca","Type":"ContainerDied","Data":"70ccc4a779599e0f6d609dc48f6af46e8e652a37e74bcc12b6f568c13c95c6f4"} Mar 18 14:12:05 crc kubenswrapper[4912]: I0318 14:12:05.196593 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ccc4a779599e0f6d609dc48f6af46e8e652a37e74bcc12b6f568c13c95c6f4" Mar 18 14:12:05 crc kubenswrapper[4912]: I0318 14:12:05.793226 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-ghdb9"] Mar 18 14:12:05 crc kubenswrapper[4912]: I0318 14:12:05.814767 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-ghdb9"] Mar 18 14:12:06 crc kubenswrapper[4912]: I0318 14:12:06.247177 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c1ce29-8a72-4ad2-b226-f39747eba1c2" path="/var/lib/kubelet/pods/b8c1ce29-8a72-4ad2-b226-f39747eba1c2/volumes" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.211633 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rnq8f"] Mar 18 14:12:08 crc kubenswrapper[4912]: E0318 14:12:08.213420 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15617ee9-18d6-42e1-b207-cbbfc9f938ca" containerName="oc" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.213443 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="15617ee9-18d6-42e1-b207-cbbfc9f938ca" containerName="oc" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.213857 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="15617ee9-18d6-42e1-b207-cbbfc9f938ca" containerName="oc" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 
14:12:08.216565 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.257137 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rnq8f"] Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.263072 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9wjj\" (UniqueName: \"kubernetes.io/projected/3b95ebd7-f92b-4d85-a364-1e1433671b03-kube-api-access-j9wjj\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.263382 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-utilities\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.263653 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-catalog-content\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.366602 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9wjj\" (UniqueName: \"kubernetes.io/projected/3b95ebd7-f92b-4d85-a364-1e1433671b03-kube-api-access-j9wjj\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc 
kubenswrapper[4912]: I0318 14:12:08.366833 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-utilities\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.366896 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-catalog-content\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.369302 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-utilities\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.369479 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-catalog-content\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.408878 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9wjj\" (UniqueName: \"kubernetes.io/projected/3b95ebd7-f92b-4d85-a364-1e1433671b03-kube-api-access-j9wjj\") pod \"certified-operators-rnq8f\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:08 crc kubenswrapper[4912]: I0318 14:12:08.542565 4912 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:09 crc kubenswrapper[4912]: I0318 14:12:09.156804 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rnq8f"] Mar 18 14:12:09 crc kubenswrapper[4912]: I0318 14:12:09.229195 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:12:09 crc kubenswrapper[4912]: E0318 14:12:09.229787 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:12:09 crc kubenswrapper[4912]: I0318 14:12:09.298499 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnq8f" event={"ID":"3b95ebd7-f92b-4d85-a364-1e1433671b03","Type":"ContainerStarted","Data":"cd3cead2c85f1cbcd231a8097244c1069fb936fc684766dc5453d29a5f374ca4"} Mar 18 14:12:10 crc kubenswrapper[4912]: I0318 14:12:10.314000 4912 generic.go:334] "Generic (PLEG): container finished" podID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerID="a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc" exitCode=0 Mar 18 14:12:10 crc kubenswrapper[4912]: I0318 14:12:10.314145 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnq8f" event={"ID":"3b95ebd7-f92b-4d85-a364-1e1433671b03","Type":"ContainerDied","Data":"a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc"} Mar 18 14:12:11 crc kubenswrapper[4912]: I0318 14:12:11.330497 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-rnq8f" event={"ID":"3b95ebd7-f92b-4d85-a364-1e1433671b03","Type":"ContainerStarted","Data":"626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359"} Mar 18 14:12:13 crc kubenswrapper[4912]: I0318 14:12:13.360827 4912 generic.go:334] "Generic (PLEG): container finished" podID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerID="626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359" exitCode=0 Mar 18 14:12:13 crc kubenswrapper[4912]: I0318 14:12:13.360912 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnq8f" event={"ID":"3b95ebd7-f92b-4d85-a364-1e1433671b03","Type":"ContainerDied","Data":"626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359"} Mar 18 14:12:16 crc kubenswrapper[4912]: I0318 14:12:16.401665 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnq8f" event={"ID":"3b95ebd7-f92b-4d85-a364-1e1433671b03","Type":"ContainerStarted","Data":"7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23"} Mar 18 14:12:16 crc kubenswrapper[4912]: I0318 14:12:16.424032 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rnq8f" podStartSLOduration=3.96627784 podStartE2EDuration="8.424010351s" podCreationTimestamp="2026-03-18 14:12:08 +0000 UTC" firstStartedPulling="2026-03-18 14:12:10.317269943 +0000 UTC m=+4178.776697368" lastFinishedPulling="2026-03-18 14:12:14.775002454 +0000 UTC m=+4183.234429879" observedRunningTime="2026-03-18 14:12:16.421877733 +0000 UTC m=+4184.881305158" watchObservedRunningTime="2026-03-18 14:12:16.424010351 +0000 UTC m=+4184.883437776" Mar 18 14:12:18 crc kubenswrapper[4912]: I0318 14:12:18.543240 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:18 crc kubenswrapper[4912]: I0318 14:12:18.545398 
4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:18 crc kubenswrapper[4912]: I0318 14:12:18.600421 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:20 crc kubenswrapper[4912]: I0318 14:12:20.512903 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:20 crc kubenswrapper[4912]: I0318 14:12:20.578828 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rnq8f"] Mar 18 14:12:22 crc kubenswrapper[4912]: I0318 14:12:22.237422 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:12:22 crc kubenswrapper[4912]: E0318 14:12:22.238305 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:12:22 crc kubenswrapper[4912]: I0318 14:12:22.480685 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rnq8f" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="registry-server" containerID="cri-o://7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23" gracePeriod=2 Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.073128 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.170872 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-catalog-content\") pod \"3b95ebd7-f92b-4d85-a364-1e1433671b03\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.171261 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-utilities\") pod \"3b95ebd7-f92b-4d85-a364-1e1433671b03\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.171476 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9wjj\" (UniqueName: \"kubernetes.io/projected/3b95ebd7-f92b-4d85-a364-1e1433671b03-kube-api-access-j9wjj\") pod \"3b95ebd7-f92b-4d85-a364-1e1433671b03\" (UID: \"3b95ebd7-f92b-4d85-a364-1e1433671b03\") " Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.176900 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-utilities" (OuterVolumeSpecName: "utilities") pod "3b95ebd7-f92b-4d85-a364-1e1433671b03" (UID: "3b95ebd7-f92b-4d85-a364-1e1433671b03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.179560 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b95ebd7-f92b-4d85-a364-1e1433671b03-kube-api-access-j9wjj" (OuterVolumeSpecName: "kube-api-access-j9wjj") pod "3b95ebd7-f92b-4d85-a364-1e1433671b03" (UID: "3b95ebd7-f92b-4d85-a364-1e1433671b03"). InnerVolumeSpecName "kube-api-access-j9wjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.230364 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b95ebd7-f92b-4d85-a364-1e1433671b03" (UID: "3b95ebd7-f92b-4d85-a364-1e1433671b03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.275482 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.276701 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9wjj\" (UniqueName: \"kubernetes.io/projected/3b95ebd7-f92b-4d85-a364-1e1433671b03-kube-api-access-j9wjj\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.276766 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b95ebd7-f92b-4d85-a364-1e1433671b03-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.492474 4912 generic.go:334] "Generic (PLEG): container finished" podID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerID="7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23" exitCode=0 Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.492529 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnq8f" event={"ID":"3b95ebd7-f92b-4d85-a364-1e1433671b03","Type":"ContainerDied","Data":"7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23"} Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.492576 4912 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rnq8f" event={"ID":"3b95ebd7-f92b-4d85-a364-1e1433671b03","Type":"ContainerDied","Data":"cd3cead2c85f1cbcd231a8097244c1069fb936fc684766dc5453d29a5f374ca4"} Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.492602 4912 scope.go:117] "RemoveContainer" containerID="7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.492612 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rnq8f" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.522526 4912 scope.go:117] "RemoveContainer" containerID="626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.553690 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rnq8f"] Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.562409 4912 scope.go:117] "RemoveContainer" containerID="a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.573973 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rnq8f"] Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.617413 4912 scope.go:117] "RemoveContainer" containerID="7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23" Mar 18 14:12:23 crc kubenswrapper[4912]: E0318 14:12:23.617997 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23\": container with ID starting with 7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23 not found: ID does not exist" containerID="7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 
14:12:23.618063 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23"} err="failed to get container status \"7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23\": rpc error: code = NotFound desc = could not find container \"7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23\": container with ID starting with 7c0b67737e5347c3ebe573ea9f2857c4733599f5b068b42f0f53d1cee32ddf23 not found: ID does not exist" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.618088 4912 scope.go:117] "RemoveContainer" containerID="626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359" Mar 18 14:12:23 crc kubenswrapper[4912]: E0318 14:12:23.618793 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359\": container with ID starting with 626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359 not found: ID does not exist" containerID="626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.618829 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359"} err="failed to get container status \"626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359\": rpc error: code = NotFound desc = could not find container \"626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359\": container with ID starting with 626d76d4057f1b3a6838b8b6da5dd675e3558132850c50df60e2963297894359 not found: ID does not exist" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.618844 4912 scope.go:117] "RemoveContainer" containerID="a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc" Mar 18 14:12:23 crc 
kubenswrapper[4912]: E0318 14:12:23.619110 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc\": container with ID starting with a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc not found: ID does not exist" containerID="a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc" Mar 18 14:12:23 crc kubenswrapper[4912]: I0318 14:12:23.619140 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc"} err="failed to get container status \"a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc\": rpc error: code = NotFound desc = could not find container \"a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc\": container with ID starting with a78baaa4eb440be094abc94ef43ed1204ec4206b219d2585f86cc3b9c41db2bc not found: ID does not exist" Mar 18 14:12:24 crc kubenswrapper[4912]: I0318 14:12:24.244946 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" path="/var/lib/kubelet/pods/3b95ebd7-f92b-4d85-a364-1e1433671b03/volumes" Mar 18 14:12:34 crc kubenswrapper[4912]: I0318 14:12:34.231871 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:12:34 crc kubenswrapper[4912]: E0318 14:12:34.237691 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:12:48 crc 
kubenswrapper[4912]: I0318 14:12:48.230805 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:12:48 crc kubenswrapper[4912]: E0318 14:12:48.232767 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:13:03 crc kubenswrapper[4912]: I0318 14:13:03.229265 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:13:03 crc kubenswrapper[4912]: E0318 14:13:03.232413 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:13:05 crc kubenswrapper[4912]: I0318 14:13:05.954960 4912 scope.go:117] "RemoveContainer" containerID="604d177f2038a60e87d61e60f1217bebbf0482d64195eda499bf097ca909395c" Mar 18 14:13:14 crc kubenswrapper[4912]: I0318 14:13:14.232367 4912 scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:13:15 crc kubenswrapper[4912]: I0318 14:13:15.249074 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"6da300afdcb0ef55d1971d4474b7702217d071cc871cb9db8ddd6d8194adf14d"} Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.161382 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564054-cbdc8"] Mar 18 14:14:00 crc kubenswrapper[4912]: E0318 14:14:00.162687 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="extract-utilities" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.162717 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="extract-utilities" Mar 18 14:14:00 crc kubenswrapper[4912]: E0318 14:14:00.162750 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="registry-server" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.162759 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="registry-server" Mar 18 14:14:00 crc kubenswrapper[4912]: E0318 14:14:00.162794 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="extract-content" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.162803 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="extract-content" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.163174 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b95ebd7-f92b-4d85-a364-1e1433671b03" containerName="registry-server" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.164776 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.166848 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.171394 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.172229 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.181988 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-cbdc8"] Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.270418 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84xft\" (UniqueName: \"kubernetes.io/projected/7e6b52b0-2b67-4262-b045-3cb58a8c9cda-kube-api-access-84xft\") pod \"auto-csr-approver-29564054-cbdc8\" (UID: \"7e6b52b0-2b67-4262-b045-3cb58a8c9cda\") " pod="openshift-infra/auto-csr-approver-29564054-cbdc8" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.373230 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84xft\" (UniqueName: \"kubernetes.io/projected/7e6b52b0-2b67-4262-b045-3cb58a8c9cda-kube-api-access-84xft\") pod \"auto-csr-approver-29564054-cbdc8\" (UID: \"7e6b52b0-2b67-4262-b045-3cb58a8c9cda\") " pod="openshift-infra/auto-csr-approver-29564054-cbdc8" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.415134 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84xft\" (UniqueName: \"kubernetes.io/projected/7e6b52b0-2b67-4262-b045-3cb58a8c9cda-kube-api-access-84xft\") pod \"auto-csr-approver-29564054-cbdc8\" (UID: \"7e6b52b0-2b67-4262-b045-3cb58a8c9cda\") " 
pod="openshift-infra/auto-csr-approver-29564054-cbdc8" Mar 18 14:14:00 crc kubenswrapper[4912]: I0318 14:14:00.503235 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" Mar 18 14:14:01 crc kubenswrapper[4912]: I0318 14:14:01.057765 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-cbdc8"] Mar 18 14:14:01 crc kubenswrapper[4912]: I0318 14:14:01.818357 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" event={"ID":"7e6b52b0-2b67-4262-b045-3cb58a8c9cda","Type":"ContainerStarted","Data":"363b714f6426ed8dc3ffdeb4cbb505d2d560836e5be2fd318910f949a90e6b7a"} Mar 18 14:14:02 crc kubenswrapper[4912]: I0318 14:14:02.845327 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" event={"ID":"7e6b52b0-2b67-4262-b045-3cb58a8c9cda","Type":"ContainerStarted","Data":"83a52a16289a3cdef81a88a6f3a89057be675f747d9df87b4092a8bd128fbce9"} Mar 18 14:14:02 crc kubenswrapper[4912]: I0318 14:14:02.876324 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" podStartSLOduration=1.6198845130000001 podStartE2EDuration="2.876030355s" podCreationTimestamp="2026-03-18 14:14:00 +0000 UTC" firstStartedPulling="2026-03-18 14:14:01.0568128 +0000 UTC m=+4289.516240225" lastFinishedPulling="2026-03-18 14:14:02.312958642 +0000 UTC m=+4290.772386067" observedRunningTime="2026-03-18 14:14:02.870204417 +0000 UTC m=+4291.329631862" watchObservedRunningTime="2026-03-18 14:14:02.876030355 +0000 UTC m=+4291.335457780" Mar 18 14:14:03 crc kubenswrapper[4912]: I0318 14:14:03.859287 4912 generic.go:334] "Generic (PLEG): container finished" podID="7e6b52b0-2b67-4262-b045-3cb58a8c9cda" containerID="83a52a16289a3cdef81a88a6f3a89057be675f747d9df87b4092a8bd128fbce9" exitCode=0 Mar 18 14:14:03 crc 
kubenswrapper[4912]: I0318 14:14:03.859355 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" event={"ID":"7e6b52b0-2b67-4262-b045-3cb58a8c9cda","Type":"ContainerDied","Data":"83a52a16289a3cdef81a88a6f3a89057be675f747d9df87b4092a8bd128fbce9"} Mar 18 14:14:05 crc kubenswrapper[4912]: I0318 14:14:05.349325 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" Mar 18 14:14:05 crc kubenswrapper[4912]: I0318 14:14:05.372361 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84xft\" (UniqueName: \"kubernetes.io/projected/7e6b52b0-2b67-4262-b045-3cb58a8c9cda-kube-api-access-84xft\") pod \"7e6b52b0-2b67-4262-b045-3cb58a8c9cda\" (UID: \"7e6b52b0-2b67-4262-b045-3cb58a8c9cda\") " Mar 18 14:14:05 crc kubenswrapper[4912]: I0318 14:14:05.381291 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6b52b0-2b67-4262-b045-3cb58a8c9cda-kube-api-access-84xft" (OuterVolumeSpecName: "kube-api-access-84xft") pod "7e6b52b0-2b67-4262-b045-3cb58a8c9cda" (UID: "7e6b52b0-2b67-4262-b045-3cb58a8c9cda"). InnerVolumeSpecName "kube-api-access-84xft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:14:05 crc kubenswrapper[4912]: I0318 14:14:05.476815 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84xft\" (UniqueName: \"kubernetes.io/projected/7e6b52b0-2b67-4262-b045-3cb58a8c9cda-kube-api-access-84xft\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:05 crc kubenswrapper[4912]: I0318 14:14:05.900404 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" event={"ID":"7e6b52b0-2b67-4262-b045-3cb58a8c9cda","Type":"ContainerDied","Data":"363b714f6426ed8dc3ffdeb4cbb505d2d560836e5be2fd318910f949a90e6b7a"} Mar 18 14:14:05 crc kubenswrapper[4912]: I0318 14:14:05.900745 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="363b714f6426ed8dc3ffdeb4cbb505d2d560836e5be2fd318910f949a90e6b7a" Mar 18 14:14:05 crc kubenswrapper[4912]: I0318 14:14:05.900529 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-cbdc8" Mar 18 14:14:06 crc kubenswrapper[4912]: I0318 14:14:06.438141 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-cf8gc"] Mar 18 14:14:06 crc kubenswrapper[4912]: I0318 14:14:06.453597 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-cf8gc"] Mar 18 14:14:08 crc kubenswrapper[4912]: I0318 14:14:08.244368 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9853e4-ff99-44df-89f4-72113fe0c319" path="/var/lib/kubelet/pods/4f9853e4-ff99-44df-89f4-72113fe0c319/volumes" Mar 18 14:14:28 crc kubenswrapper[4912]: I0318 14:14:28.184667 4912 trace.go:236] Trace[2068138154]: "Calculate volume metrics of storage for pod minio-dev/minio" (18-Mar-2026 14:14:26.961) (total time: 1220ms): Mar 18 14:14:28 crc kubenswrapper[4912]: Trace[2068138154]: [1.220330499s] [1.220330499s] END Mar 18 14:15:00 
crc kubenswrapper[4912]: I0318 14:15:00.160032 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz"] Mar 18 14:15:00 crc kubenswrapper[4912]: E0318 14:15:00.161395 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6b52b0-2b67-4262-b045-3cb58a8c9cda" containerName="oc" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.161413 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6b52b0-2b67-4262-b045-3cb58a8c9cda" containerName="oc" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.161817 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6b52b0-2b67-4262-b045-3cb58a8c9cda" containerName="oc" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.163226 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.166406 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.167686 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.199924 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz"] Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.201718 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ecdb20a-5ffb-438f-a684-7161ee45fea2-secret-volume\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 
crc kubenswrapper[4912]: I0318 14:15:00.202063 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v4xd\" (UniqueName: \"kubernetes.io/projected/7ecdb20a-5ffb-438f-a684-7161ee45fea2-kube-api-access-2v4xd\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.202389 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ecdb20a-5ffb-438f-a684-7161ee45fea2-config-volume\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.312417 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ecdb20a-5ffb-438f-a684-7161ee45fea2-config-volume\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.312944 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ecdb20a-5ffb-438f-a684-7161ee45fea2-secret-volume\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.314166 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v4xd\" (UniqueName: \"kubernetes.io/projected/7ecdb20a-5ffb-438f-a684-7161ee45fea2-kube-api-access-2v4xd\") pod 
\"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.315121 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ecdb20a-5ffb-438f-a684-7161ee45fea2-config-volume\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.319935 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ecdb20a-5ffb-438f-a684-7161ee45fea2-secret-volume\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.332549 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v4xd\" (UniqueName: \"kubernetes.io/projected/7ecdb20a-5ffb-438f-a684-7161ee45fea2-kube-api-access-2v4xd\") pod \"collect-profiles-29564055-x8dqz\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.498728 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:00 crc kubenswrapper[4912]: I0318 14:15:00.974769 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz"] Mar 18 14:15:01 crc kubenswrapper[4912]: I0318 14:15:01.119694 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" event={"ID":"7ecdb20a-5ffb-438f-a684-7161ee45fea2","Type":"ContainerStarted","Data":"3a1baf83f698d1f10fd71208465941be52181903f47c61d0c46cdcd51f829a0f"} Mar 18 14:15:02 crc kubenswrapper[4912]: I0318 14:15:02.136839 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" event={"ID":"7ecdb20a-5ffb-438f-a684-7161ee45fea2","Type":"ContainerDied","Data":"9aeb0d040cce8af0f4f2da6423f4d40d796c135a8ed399caef7350fd5d183601"} Mar 18 14:15:02 crc kubenswrapper[4912]: I0318 14:15:02.136632 4912 generic.go:334] "Generic (PLEG): container finished" podID="7ecdb20a-5ffb-438f-a684-7161ee45fea2" containerID="9aeb0d040cce8af0f4f2da6423f4d40d796c135a8ed399caef7350fd5d183601" exitCode=0 Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.611647 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.645454 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ecdb20a-5ffb-438f-a684-7161ee45fea2-secret-volume\") pod \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.645667 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ecdb20a-5ffb-438f-a684-7161ee45fea2-config-volume\") pod \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.645763 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v4xd\" (UniqueName: \"kubernetes.io/projected/7ecdb20a-5ffb-438f-a684-7161ee45fea2-kube-api-access-2v4xd\") pod \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\" (UID: \"7ecdb20a-5ffb-438f-a684-7161ee45fea2\") " Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.647527 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ecdb20a-5ffb-438f-a684-7161ee45fea2-config-volume" (OuterVolumeSpecName: "config-volume") pod "7ecdb20a-5ffb-438f-a684-7161ee45fea2" (UID: "7ecdb20a-5ffb-438f-a684-7161ee45fea2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.652451 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdb20a-5ffb-438f-a684-7161ee45fea2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7ecdb20a-5ffb-438f-a684-7161ee45fea2" (UID: "7ecdb20a-5ffb-438f-a684-7161ee45fea2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.652502 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ecdb20a-5ffb-438f-a684-7161ee45fea2-kube-api-access-2v4xd" (OuterVolumeSpecName: "kube-api-access-2v4xd") pod "7ecdb20a-5ffb-438f-a684-7161ee45fea2" (UID: "7ecdb20a-5ffb-438f-a684-7161ee45fea2"). InnerVolumeSpecName "kube-api-access-2v4xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.656443 4912 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ecdb20a-5ffb-438f-a684-7161ee45fea2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.656475 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ecdb20a-5ffb-438f-a684-7161ee45fea2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:03 crc kubenswrapper[4912]: I0318 14:15:03.656489 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v4xd\" (UniqueName: \"kubernetes.io/projected/7ecdb20a-5ffb-438f-a684-7161ee45fea2-kube-api-access-2v4xd\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:04 crc kubenswrapper[4912]: I0318 14:15:04.180500 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" event={"ID":"7ecdb20a-5ffb-438f-a684-7161ee45fea2","Type":"ContainerDied","Data":"3a1baf83f698d1f10fd71208465941be52181903f47c61d0c46cdcd51f829a0f"} Mar 18 14:15:04 crc kubenswrapper[4912]: I0318 14:15:04.180567 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a1baf83f698d1f10fd71208465941be52181903f47c61d0c46cdcd51f829a0f" Mar 18 14:15:04 crc kubenswrapper[4912]: I0318 14:15:04.180674 4912 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-x8dqz" Mar 18 14:15:04 crc kubenswrapper[4912]: I0318 14:15:04.709729 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf"] Mar 18 14:15:04 crc kubenswrapper[4912]: I0318 14:15:04.720934 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-nkjcf"] Mar 18 14:15:06 crc kubenswrapper[4912]: I0318 14:15:06.109209 4912 scope.go:117] "RemoveContainer" containerID="f40225a9a8f4201285c97a2a2cdfef917bfb3258ed2723d3811b30e3bb0b6e19" Mar 18 14:15:06 crc kubenswrapper[4912]: I0318 14:15:06.249188 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d9949b-baff-4ef3-8879-da61b30d7b24" path="/var/lib/kubelet/pods/07d9949b-baff-4ef3-8879-da61b30d7b24/volumes" Mar 18 14:15:37 crc kubenswrapper[4912]: I0318 14:15:36.999626 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:15:37 crc kubenswrapper[4912]: I0318 14:15:37.001495 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.158218 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564056-k76gj"] Mar 18 14:16:00 crc kubenswrapper[4912]: E0318 14:16:00.159701 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7ecdb20a-5ffb-438f-a684-7161ee45fea2" containerName="collect-profiles" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.159717 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecdb20a-5ffb-438f-a684-7161ee45fea2" containerName="collect-profiles" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.159997 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecdb20a-5ffb-438f-a684-7161ee45fea2" containerName="collect-profiles" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.161065 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-k76gj" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.163959 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.168373 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.168837 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.176740 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-k76gj"] Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.274226 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxqg6\" (UniqueName: \"kubernetes.io/projected/effe7aa7-db6e-47c6-97e9-b9ab7afea1b6-kube-api-access-qxqg6\") pod \"auto-csr-approver-29564056-k76gj\" (UID: \"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6\") " pod="openshift-infra/auto-csr-approver-29564056-k76gj" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.377781 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxqg6\" (UniqueName: 
\"kubernetes.io/projected/effe7aa7-db6e-47c6-97e9-b9ab7afea1b6-kube-api-access-qxqg6\") pod \"auto-csr-approver-29564056-k76gj\" (UID: \"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6\") " pod="openshift-infra/auto-csr-approver-29564056-k76gj" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.399769 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxqg6\" (UniqueName: \"kubernetes.io/projected/effe7aa7-db6e-47c6-97e9-b9ab7afea1b6-kube-api-access-qxqg6\") pod \"auto-csr-approver-29564056-k76gj\" (UID: \"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6\") " pod="openshift-infra/auto-csr-approver-29564056-k76gj" Mar 18 14:16:00 crc kubenswrapper[4912]: I0318 14:16:00.491512 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-k76gj" Mar 18 14:16:01 crc kubenswrapper[4912]: I0318 14:16:01.012657 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-k76gj"] Mar 18 14:16:01 crc kubenswrapper[4912]: W0318 14:16:01.022204 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeffe7aa7_db6e_47c6_97e9_b9ab7afea1b6.slice/crio-20a3432b21318283d8ba34d6b5e03b301be04fe2a4405983e2c2168e5308a0c9 WatchSource:0}: Error finding container 20a3432b21318283d8ba34d6b5e03b301be04fe2a4405983e2c2168e5308a0c9: Status 404 returned error can't find the container with id 20a3432b21318283d8ba34d6b5e03b301be04fe2a4405983e2c2168e5308a0c9 Mar 18 14:16:01 crc kubenswrapper[4912]: I0318 14:16:01.910243 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-k76gj" event={"ID":"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6","Type":"ContainerStarted","Data":"20a3432b21318283d8ba34d6b5e03b301be04fe2a4405983e2c2168e5308a0c9"} Mar 18 14:16:02 crc kubenswrapper[4912]: I0318 14:16:02.926502 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564056-k76gj" event={"ID":"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6","Type":"ContainerStarted","Data":"3d3047b47a02dbbbb03aa4f3df823985627afd9b1e2793856b7327f89a3a71cc"} Mar 18 14:16:02 crc kubenswrapper[4912]: I0318 14:16:02.974020 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564056-k76gj" podStartSLOduration=1.8067480580000002 podStartE2EDuration="2.973940456s" podCreationTimestamp="2026-03-18 14:16:00 +0000 UTC" firstStartedPulling="2026-03-18 14:16:01.025477687 +0000 UTC m=+4409.484905102" lastFinishedPulling="2026-03-18 14:16:02.192670075 +0000 UTC m=+4410.652097500" observedRunningTime="2026-03-18 14:16:02.952055193 +0000 UTC m=+4411.411482638" watchObservedRunningTime="2026-03-18 14:16:02.973940456 +0000 UTC m=+4411.433367881" Mar 18 14:16:03 crc kubenswrapper[4912]: I0318 14:16:03.943288 4912 generic.go:334] "Generic (PLEG): container finished" podID="effe7aa7-db6e-47c6-97e9-b9ab7afea1b6" containerID="3d3047b47a02dbbbb03aa4f3df823985627afd9b1e2793856b7327f89a3a71cc" exitCode=0 Mar 18 14:16:03 crc kubenswrapper[4912]: I0318 14:16:03.943397 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-k76gj" event={"ID":"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6","Type":"ContainerDied","Data":"3d3047b47a02dbbbb03aa4f3df823985627afd9b1e2793856b7327f89a3a71cc"} Mar 18 14:16:05 crc kubenswrapper[4912]: I0318 14:16:05.463245 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-k76gj" Mar 18 14:16:05 crc kubenswrapper[4912]: I0318 14:16:05.586201 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxqg6\" (UniqueName: \"kubernetes.io/projected/effe7aa7-db6e-47c6-97e9-b9ab7afea1b6-kube-api-access-qxqg6\") pod \"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6\" (UID: \"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6\") " Mar 18 14:16:05 crc kubenswrapper[4912]: I0318 14:16:05.600535 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effe7aa7-db6e-47c6-97e9-b9ab7afea1b6-kube-api-access-qxqg6" (OuterVolumeSpecName: "kube-api-access-qxqg6") pod "effe7aa7-db6e-47c6-97e9-b9ab7afea1b6" (UID: "effe7aa7-db6e-47c6-97e9-b9ab7afea1b6"). InnerVolumeSpecName "kube-api-access-qxqg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:05 crc kubenswrapper[4912]: I0318 14:16:05.691886 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxqg6\" (UniqueName: \"kubernetes.io/projected/effe7aa7-db6e-47c6-97e9-b9ab7afea1b6-kube-api-access-qxqg6\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:05 crc kubenswrapper[4912]: I0318 14:16:05.978384 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-k76gj" event={"ID":"effe7aa7-db6e-47c6-97e9-b9ab7afea1b6","Type":"ContainerDied","Data":"20a3432b21318283d8ba34d6b5e03b301be04fe2a4405983e2c2168e5308a0c9"} Mar 18 14:16:05 crc kubenswrapper[4912]: I0318 14:16:05.978433 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a3432b21318283d8ba34d6b5e03b301be04fe2a4405983e2c2168e5308a0c9" Mar 18 14:16:05 crc kubenswrapper[4912]: I0318 14:16:05.978498 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-k76gj" Mar 18 14:16:06 crc kubenswrapper[4912]: I0318 14:16:06.225521 4912 scope.go:117] "RemoveContainer" containerID="c4e2b0d1399d0ac74142743a08b0df0ec8b7bf94c0c981e6e09748d0c7b96f33" Mar 18 14:16:06 crc kubenswrapper[4912]: I0318 14:16:06.557574 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-9srqk"] Mar 18 14:16:06 crc kubenswrapper[4912]: I0318 14:16:06.572738 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-9srqk"] Mar 18 14:16:07 crc kubenswrapper[4912]: I0318 14:16:07.000063 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:16:07 crc kubenswrapper[4912]: I0318 14:16:07.000142 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:16:08 crc kubenswrapper[4912]: I0318 14:16:08.246292 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68102db5-4e1b-4ca8-8fa5-00b054ceebd6" path="/var/lib/kubelet/pods/68102db5-4e1b-4ca8-8fa5-00b054ceebd6/volumes" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.088378 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-plxhb"] Mar 18 14:16:35 crc kubenswrapper[4912]: E0318 14:16:35.090355 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effe7aa7-db6e-47c6-97e9-b9ab7afea1b6" containerName="oc" Mar 18 14:16:35 crc 
kubenswrapper[4912]: I0318 14:16:35.090371 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="effe7aa7-db6e-47c6-97e9-b9ab7afea1b6" containerName="oc" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.090693 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="effe7aa7-db6e-47c6-97e9-b9ab7afea1b6" containerName="oc" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.092895 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.100596 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plxhb"] Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.171110 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txbvh\" (UniqueName: \"kubernetes.io/projected/64ec8921-9f2e-4f93-93a4-ff3347e91a07-kube-api-access-txbvh\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.171796 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-utilities\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.172199 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-catalog-content\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 
14:16:35.275644 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-utilities\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.275764 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-catalog-content\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.275940 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txbvh\" (UniqueName: \"kubernetes.io/projected/64ec8921-9f2e-4f93-93a4-ff3347e91a07-kube-api-access-txbvh\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.276618 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-catalog-content\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.277354 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-utilities\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.312508 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-txbvh\" (UniqueName: \"kubernetes.io/projected/64ec8921-9f2e-4f93-93a4-ff3347e91a07-kube-api-access-txbvh\") pod \"redhat-operators-plxhb\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.415619 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:35 crc kubenswrapper[4912]: I0318 14:16:35.984457 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plxhb"] Mar 18 14:16:36 crc kubenswrapper[4912]: I0318 14:16:36.355894 4912 generic.go:334] "Generic (PLEG): container finished" podID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerID="535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09" exitCode=0 Mar 18 14:16:36 crc kubenswrapper[4912]: I0318 14:16:36.355958 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plxhb" event={"ID":"64ec8921-9f2e-4f93-93a4-ff3347e91a07","Type":"ContainerDied","Data":"535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09"} Mar 18 14:16:36 crc kubenswrapper[4912]: I0318 14:16:36.355991 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plxhb" event={"ID":"64ec8921-9f2e-4f93-93a4-ff3347e91a07","Type":"ContainerStarted","Data":"ba3992b2fa996a6743797c467c2a22781cdbd2c4881b7d618cfae82e6ecc3b74"} Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.021885 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.022503 4912 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.022580 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.024461 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6da300afdcb0ef55d1971d4474b7702217d071cc871cb9db8ddd6d8194adf14d"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.024549 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://6da300afdcb0ef55d1971d4474b7702217d071cc871cb9db8ddd6d8194adf14d" gracePeriod=600 Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.370830 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="6da300afdcb0ef55d1971d4474b7702217d071cc871cb9db8ddd6d8194adf14d" exitCode=0 Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.370901 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"6da300afdcb0ef55d1971d4474b7702217d071cc871cb9db8ddd6d8194adf14d"} Mar 18 14:16:37 crc kubenswrapper[4912]: I0318 14:16:37.371342 4912 
scope.go:117] "RemoveContainer" containerID="78205da000da84178f698329864686fddb0dfb7a8e19568763f3a2bddd4165d3" Mar 18 14:16:38 crc kubenswrapper[4912]: I0318 14:16:38.389643 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a"} Mar 18 14:16:38 crc kubenswrapper[4912]: I0318 14:16:38.394931 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plxhb" event={"ID":"64ec8921-9f2e-4f93-93a4-ff3347e91a07","Type":"ContainerStarted","Data":"3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984"} Mar 18 14:16:40 crc kubenswrapper[4912]: E0318 14:16:40.565104 4912 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.143:41680->38.102.83.143:33015: read tcp 38.102.83.143:41680->38.102.83.143:33015: read: connection reset by peer Mar 18 14:16:40 crc kubenswrapper[4912]: E0318 14:16:40.566107 4912 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.143:41680->38.102.83.143:33015: write tcp 38.102.83.143:41680->38.102.83.143:33015: write: broken pipe Mar 18 14:16:43 crc kubenswrapper[4912]: I0318 14:16:43.461151 4912 generic.go:334] "Generic (PLEG): container finished" podID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerID="3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984" exitCode=0 Mar 18 14:16:43 crc kubenswrapper[4912]: I0318 14:16:43.461232 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plxhb" event={"ID":"64ec8921-9f2e-4f93-93a4-ff3347e91a07","Type":"ContainerDied","Data":"3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984"} Mar 18 14:16:44 crc kubenswrapper[4912]: I0318 14:16:44.476470 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-plxhb" event={"ID":"64ec8921-9f2e-4f93-93a4-ff3347e91a07","Type":"ContainerStarted","Data":"2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7"} Mar 18 14:16:44 crc kubenswrapper[4912]: I0318 14:16:44.501441 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-plxhb" podStartSLOduration=1.715738296 podStartE2EDuration="9.501414324s" podCreationTimestamp="2026-03-18 14:16:35 +0000 UTC" firstStartedPulling="2026-03-18 14:16:36.358206229 +0000 UTC m=+4444.817633654" lastFinishedPulling="2026-03-18 14:16:44.143882257 +0000 UTC m=+4452.603309682" observedRunningTime="2026-03-18 14:16:44.4990278 +0000 UTC m=+4452.958455245" watchObservedRunningTime="2026-03-18 14:16:44.501414324 +0000 UTC m=+4452.960841749" Mar 18 14:16:45 crc kubenswrapper[4912]: I0318 14:16:45.417592 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:45 crc kubenswrapper[4912]: I0318 14:16:45.418133 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:16:46 crc kubenswrapper[4912]: I0318 14:16:46.504211 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-plxhb" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="registry-server" probeResult="failure" output=< Mar 18 14:16:46 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:16:46 crc kubenswrapper[4912]: > Mar 18 14:16:56 crc kubenswrapper[4912]: I0318 14:16:56.474653 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-plxhb" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="registry-server" probeResult="failure" output=< Mar 18 14:16:56 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 
18 14:16:56 crc kubenswrapper[4912]: > Mar 18 14:17:06 crc kubenswrapper[4912]: I0318 14:17:06.330914 4912 scope.go:117] "RemoveContainer" containerID="d8eb327b797ab247c84e3e87bd5a6e411dcf1449d397926adfc029d2f3ebaba2" Mar 18 14:17:06 crc kubenswrapper[4912]: I0318 14:17:06.482336 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-plxhb" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="registry-server" probeResult="failure" output=< Mar 18 14:17:06 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:17:06 crc kubenswrapper[4912]: > Mar 18 14:17:15 crc kubenswrapper[4912]: I0318 14:17:15.988336 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:17:16 crc kubenswrapper[4912]: I0318 14:17:16.053357 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:17:16 crc kubenswrapper[4912]: I0318 14:17:16.260941 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plxhb"] Mar 18 14:17:17 crc kubenswrapper[4912]: I0318 14:17:17.915136 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-plxhb" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="registry-server" containerID="cri-o://2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7" gracePeriod=2 Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.699797 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.787901 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-utilities\") pod \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.788140 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txbvh\" (UniqueName: \"kubernetes.io/projected/64ec8921-9f2e-4f93-93a4-ff3347e91a07-kube-api-access-txbvh\") pod \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.788421 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-catalog-content\") pod \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\" (UID: \"64ec8921-9f2e-4f93-93a4-ff3347e91a07\") " Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.790451 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-utilities" (OuterVolumeSpecName: "utilities") pod "64ec8921-9f2e-4f93-93a4-ff3347e91a07" (UID: "64ec8921-9f2e-4f93-93a4-ff3347e91a07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.802365 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ec8921-9f2e-4f93-93a4-ff3347e91a07-kube-api-access-txbvh" (OuterVolumeSpecName: "kube-api-access-txbvh") pod "64ec8921-9f2e-4f93-93a4-ff3347e91a07" (UID: "64ec8921-9f2e-4f93-93a4-ff3347e91a07"). InnerVolumeSpecName "kube-api-access-txbvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.891737 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.891781 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txbvh\" (UniqueName: \"kubernetes.io/projected/64ec8921-9f2e-4f93-93a4-ff3347e91a07-kube-api-access-txbvh\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.928854 4912 generic.go:334] "Generic (PLEG): container finished" podID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerID="2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7" exitCode=0 Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.928928 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plxhb" event={"ID":"64ec8921-9f2e-4f93-93a4-ff3347e91a07","Type":"ContainerDied","Data":"2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7"} Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.928977 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plxhb" event={"ID":"64ec8921-9f2e-4f93-93a4-ff3347e91a07","Type":"ContainerDied","Data":"ba3992b2fa996a6743797c467c2a22781cdbd2c4881b7d618cfae82e6ecc3b74"} Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.928999 4912 scope.go:117] "RemoveContainer" containerID="2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.929261 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plxhb" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.945764 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64ec8921-9f2e-4f93-93a4-ff3347e91a07" (UID: "64ec8921-9f2e-4f93-93a4-ff3347e91a07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.963636 4912 scope.go:117] "RemoveContainer" containerID="3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984" Mar 18 14:17:18 crc kubenswrapper[4912]: I0318 14:17:18.995593 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec8921-9f2e-4f93-93a4-ff3347e91a07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.014197 4912 scope.go:117] "RemoveContainer" containerID="535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.067739 4912 scope.go:117] "RemoveContainer" containerID="2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7" Mar 18 14:17:19 crc kubenswrapper[4912]: E0318 14:17:19.068635 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7\": container with ID starting with 2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7 not found: ID does not exist" containerID="2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.068713 4912 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7"} err="failed to get container status \"2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7\": rpc error: code = NotFound desc = could not find container \"2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7\": container with ID starting with 2d43417f573e231a0733d89c8ee969a63a2faf23a2840f6c064e5fcd574dcae7 not found: ID does not exist" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.068767 4912 scope.go:117] "RemoveContainer" containerID="3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984" Mar 18 14:17:19 crc kubenswrapper[4912]: E0318 14:17:19.069289 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984\": container with ID starting with 3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984 not found: ID does not exist" containerID="3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.069366 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984"} err="failed to get container status \"3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984\": rpc error: code = NotFound desc = could not find container \"3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984\": container with ID starting with 3e16ddad3d42a6783aaf76779d767c61689bf72fbe413c127fa79ef6f411a984 not found: ID does not exist" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.069408 4912 scope.go:117] "RemoveContainer" containerID="535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09" Mar 18 14:17:19 crc kubenswrapper[4912]: E0318 14:17:19.069917 4912 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09\": container with ID starting with 535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09 not found: ID does not exist" containerID="535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.069979 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09"} err="failed to get container status \"535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09\": rpc error: code = NotFound desc = could not find container \"535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09\": container with ID starting with 535cbdd72e2124d583226112c1ed211814fabb441c653f77be59c538cbc13b09 not found: ID does not exist" Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.281532 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plxhb"] Mar 18 14:17:19 crc kubenswrapper[4912]: I0318 14:17:19.305541 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-plxhb"] Mar 18 14:17:20 crc kubenswrapper[4912]: I0318 14:17:20.260204 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" path="/var/lib/kubelet/pods/64ec8921-9f2e-4f93-93a4-ff3347e91a07/volumes" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.160203 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564058-srdzd"] Mar 18 14:18:00 crc kubenswrapper[4912]: E0318 14:18:00.161649 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="extract-utilities" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.161673 4912 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="extract-utilities" Mar 18 14:18:00 crc kubenswrapper[4912]: E0318 14:18:00.161707 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="extract-content" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.161716 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="extract-content" Mar 18 14:18:00 crc kubenswrapper[4912]: E0318 14:18:00.161755 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="registry-server" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.161764 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="registry-server" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.162189 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ec8921-9f2e-4f93-93a4-ff3347e91a07" containerName="registry-server" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.163593 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-srdzd" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.167753 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.172575 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.173958 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.176697 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-srdzd"] Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.231002 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvxq\" (UniqueName: \"kubernetes.io/projected/6a07a877-c409-4ec5-a89b-526735d5c1a7-kube-api-access-qfvxq\") pod \"auto-csr-approver-29564058-srdzd\" (UID: \"6a07a877-c409-4ec5-a89b-526735d5c1a7\") " pod="openshift-infra/auto-csr-approver-29564058-srdzd" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.347549 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvxq\" (UniqueName: \"kubernetes.io/projected/6a07a877-c409-4ec5-a89b-526735d5c1a7-kube-api-access-qfvxq\") pod \"auto-csr-approver-29564058-srdzd\" (UID: \"6a07a877-c409-4ec5-a89b-526735d5c1a7\") " pod="openshift-infra/auto-csr-approver-29564058-srdzd" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.369382 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvxq\" (UniqueName: \"kubernetes.io/projected/6a07a877-c409-4ec5-a89b-526735d5c1a7-kube-api-access-qfvxq\") pod \"auto-csr-approver-29564058-srdzd\" (UID: \"6a07a877-c409-4ec5-a89b-526735d5c1a7\") " 
pod="openshift-infra/auto-csr-approver-29564058-srdzd" Mar 18 14:18:00 crc kubenswrapper[4912]: I0318 14:18:00.492379 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-srdzd" Mar 18 14:18:01 crc kubenswrapper[4912]: I0318 14:18:01.038309 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:18:01 crc kubenswrapper[4912]: I0318 14:18:01.039321 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-srdzd"] Mar 18 14:18:01 crc kubenswrapper[4912]: I0318 14:18:01.466976 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-srdzd" event={"ID":"6a07a877-c409-4ec5-a89b-526735d5c1a7","Type":"ContainerStarted","Data":"03650c524ab8c0f0cbc6f613a8ed56d7c75de402976afc55588119cfc65f1834"} Mar 18 14:18:03 crc kubenswrapper[4912]: I0318 14:18:03.496548 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-srdzd" event={"ID":"6a07a877-c409-4ec5-a89b-526735d5c1a7","Type":"ContainerStarted","Data":"5ad728797ae56c6e122505dcd6466eca70b82e4da1f03068956a8c73de12df29"} Mar 18 14:18:03 crc kubenswrapper[4912]: I0318 14:18:03.530272 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564058-srdzd" podStartSLOduration=2.231874059 podStartE2EDuration="3.530225745s" podCreationTimestamp="2026-03-18 14:18:00 +0000 UTC" firstStartedPulling="2026-03-18 14:18:01.038053488 +0000 UTC m=+4529.497480923" lastFinishedPulling="2026-03-18 14:18:02.336405184 +0000 UTC m=+4530.795832609" observedRunningTime="2026-03-18 14:18:03.520370108 +0000 UTC m=+4531.979797543" watchObservedRunningTime="2026-03-18 14:18:03.530225745 +0000 UTC m=+4531.989653170" Mar 18 14:18:04 crc kubenswrapper[4912]: I0318 14:18:04.510596 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="6a07a877-c409-4ec5-a89b-526735d5c1a7" containerID="5ad728797ae56c6e122505dcd6466eca70b82e4da1f03068956a8c73de12df29" exitCode=0 Mar 18 14:18:04 crc kubenswrapper[4912]: I0318 14:18:04.510697 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-srdzd" event={"ID":"6a07a877-c409-4ec5-a89b-526735d5c1a7","Type":"ContainerDied","Data":"5ad728797ae56c6e122505dcd6466eca70b82e4da1f03068956a8c73de12df29"} Mar 18 14:18:05 crc kubenswrapper[4912]: I0318 14:18:05.966119 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-srdzd" Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.138404 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfvxq\" (UniqueName: \"kubernetes.io/projected/6a07a877-c409-4ec5-a89b-526735d5c1a7-kube-api-access-qfvxq\") pod \"6a07a877-c409-4ec5-a89b-526735d5c1a7\" (UID: \"6a07a877-c409-4ec5-a89b-526735d5c1a7\") " Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.146333 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a07a877-c409-4ec5-a89b-526735d5c1a7-kube-api-access-qfvxq" (OuterVolumeSpecName: "kube-api-access-qfvxq") pod "6a07a877-c409-4ec5-a89b-526735d5c1a7" (UID: "6a07a877-c409-4ec5-a89b-526735d5c1a7"). InnerVolumeSpecName "kube-api-access-qfvxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.243034 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfvxq\" (UniqueName: \"kubernetes.io/projected/6a07a877-c409-4ec5-a89b-526735d5c1a7-kube-api-access-qfvxq\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.547512 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-srdzd" event={"ID":"6a07a877-c409-4ec5-a89b-526735d5c1a7","Type":"ContainerDied","Data":"03650c524ab8c0f0cbc6f613a8ed56d7c75de402976afc55588119cfc65f1834"} Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.547596 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03650c524ab8c0f0cbc6f613a8ed56d7c75de402976afc55588119cfc65f1834" Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.547554 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-srdzd" Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.611383 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-8ljkx"] Mar 18 14:18:06 crc kubenswrapper[4912]: I0318 14:18:06.627829 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-8ljkx"] Mar 18 14:18:08 crc kubenswrapper[4912]: I0318 14:18:08.243434 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15617ee9-18d6-42e1-b207-cbbfc9f938ca" path="/var/lib/kubelet/pods/15617ee9-18d6-42e1-b207-cbbfc9f938ca/volumes" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.178894 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bn8d7"] Mar 18 14:18:09 crc kubenswrapper[4912]: E0318 14:18:09.180212 4912 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a07a877-c409-4ec5-a89b-526735d5c1a7" containerName="oc" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.180239 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a07a877-c409-4ec5-a89b-526735d5c1a7" containerName="oc" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.180663 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a07a877-c409-4ec5-a89b-526735d5c1a7" containerName="oc" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.185350 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.200553 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn8d7"] Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.241402 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-utilities\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.244256 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-catalog-content\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.244369 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6vhv\" (UniqueName: \"kubernetes.io/projected/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-kube-api-access-d6vhv\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " 
pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.353760 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-utilities\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.354161 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-catalog-content\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.354265 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6vhv\" (UniqueName: \"kubernetes.io/projected/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-kube-api-access-d6vhv\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.356971 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-catalog-content\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.357068 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-utilities\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" 
Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.417670 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6vhv\" (UniqueName: \"kubernetes.io/projected/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-kube-api-access-d6vhv\") pod \"redhat-marketplace-bn8d7\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:09 crc kubenswrapper[4912]: I0318 14:18:09.513293 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:10 crc kubenswrapper[4912]: I0318 14:18:10.066408 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn8d7"] Mar 18 14:18:10 crc kubenswrapper[4912]: I0318 14:18:10.606762 4912 generic.go:334] "Generic (PLEG): container finished" podID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerID="72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501" exitCode=0 Mar 18 14:18:10 crc kubenswrapper[4912]: I0318 14:18:10.606961 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn8d7" event={"ID":"13741a01-ac7a-4ec0-8ada-a78da43bb9a0","Type":"ContainerDied","Data":"72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501"} Mar 18 14:18:10 crc kubenswrapper[4912]: I0318 14:18:10.607155 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn8d7" event={"ID":"13741a01-ac7a-4ec0-8ada-a78da43bb9a0","Type":"ContainerStarted","Data":"c37d6399b0876fd5647a4c4238875ebe468f830ebd876f48b8a8cf797137ae2f"} Mar 18 14:18:12 crc kubenswrapper[4912]: I0318 14:18:12.632629 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn8d7" event={"ID":"13741a01-ac7a-4ec0-8ada-a78da43bb9a0","Type":"ContainerStarted","Data":"42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574"} Mar 18 
14:18:14 crc kubenswrapper[4912]: I0318 14:18:14.659102 4912 generic.go:334] "Generic (PLEG): container finished" podID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerID="42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574" exitCode=0 Mar 18 14:18:14 crc kubenswrapper[4912]: I0318 14:18:14.659426 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn8d7" event={"ID":"13741a01-ac7a-4ec0-8ada-a78da43bb9a0","Type":"ContainerDied","Data":"42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574"} Mar 18 14:18:15 crc kubenswrapper[4912]: I0318 14:18:15.674902 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn8d7" event={"ID":"13741a01-ac7a-4ec0-8ada-a78da43bb9a0","Type":"ContainerStarted","Data":"b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf"} Mar 18 14:18:15 crc kubenswrapper[4912]: I0318 14:18:15.738744 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bn8d7" podStartSLOduration=2.165359009 podStartE2EDuration="6.738718097s" podCreationTimestamp="2026-03-18 14:18:09 +0000 UTC" firstStartedPulling="2026-03-18 14:18:10.6094116 +0000 UTC m=+4539.068839025" lastFinishedPulling="2026-03-18 14:18:15.182770688 +0000 UTC m=+4543.642198113" observedRunningTime="2026-03-18 14:18:15.69825169 +0000 UTC m=+4544.157679145" watchObservedRunningTime="2026-03-18 14:18:15.738718097 +0000 UTC m=+4544.198145522" Mar 18 14:18:18 crc kubenswrapper[4912]: E0318 14:18:18.344945 4912 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.143:40260->38.102.83.143:33015: write tcp 38.102.83.143:40260->38.102.83.143:33015: write: broken pipe Mar 18 14:18:19 crc kubenswrapper[4912]: I0318 14:18:19.514151 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:19 crc 
kubenswrapper[4912]: I0318 14:18:19.514715 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:19 crc kubenswrapper[4912]: I0318 14:18:19.569133 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:29 crc kubenswrapper[4912]: I0318 14:18:29.570880 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:29 crc kubenswrapper[4912]: I0318 14:18:29.648572 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn8d7"] Mar 18 14:18:29 crc kubenswrapper[4912]: I0318 14:18:29.852903 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bn8d7" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerName="registry-server" containerID="cri-o://b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf" gracePeriod=2 Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.501755 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.594366 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-utilities\") pod \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.594567 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6vhv\" (UniqueName: \"kubernetes.io/projected/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-kube-api-access-d6vhv\") pod \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.594672 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-catalog-content\") pod \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\" (UID: \"13741a01-ac7a-4ec0-8ada-a78da43bb9a0\") " Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.609961 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-utilities" (OuterVolumeSpecName: "utilities") pod "13741a01-ac7a-4ec0-8ada-a78da43bb9a0" (UID: "13741a01-ac7a-4ec0-8ada-a78da43bb9a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.619365 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-kube-api-access-d6vhv" (OuterVolumeSpecName: "kube-api-access-d6vhv") pod "13741a01-ac7a-4ec0-8ada-a78da43bb9a0" (UID: "13741a01-ac7a-4ec0-8ada-a78da43bb9a0"). InnerVolumeSpecName "kube-api-access-d6vhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.621808 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13741a01-ac7a-4ec0-8ada-a78da43bb9a0" (UID: "13741a01-ac7a-4ec0-8ada-a78da43bb9a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.698358 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6vhv\" (UniqueName: \"kubernetes.io/projected/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-kube-api-access-d6vhv\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.698400 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.698411 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13741a01-ac7a-4ec0-8ada-a78da43bb9a0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.866713 4912 generic.go:334] "Generic (PLEG): container finished" podID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerID="b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf" exitCode=0 Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.866767 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn8d7" event={"ID":"13741a01-ac7a-4ec0-8ada-a78da43bb9a0","Type":"ContainerDied","Data":"b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf"} Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.866779 4912 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn8d7" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.866800 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn8d7" event={"ID":"13741a01-ac7a-4ec0-8ada-a78da43bb9a0","Type":"ContainerDied","Data":"c37d6399b0876fd5647a4c4238875ebe468f830ebd876f48b8a8cf797137ae2f"} Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.866819 4912 scope.go:117] "RemoveContainer" containerID="b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.895324 4912 scope.go:117] "RemoveContainer" containerID="42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574" Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.925149 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn8d7"] Mar 18 14:18:30 crc kubenswrapper[4912]: I0318 14:18:30.937300 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn8d7"] Mar 18 14:18:31 crc kubenswrapper[4912]: I0318 14:18:31.751499 4912 scope.go:117] "RemoveContainer" containerID="72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501" Mar 18 14:18:31 crc kubenswrapper[4912]: I0318 14:18:31.829732 4912 scope.go:117] "RemoveContainer" containerID="b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf" Mar 18 14:18:31 crc kubenswrapper[4912]: E0318 14:18:31.830411 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf\": container with ID starting with b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf not found: ID does not exist" containerID="b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf" Mar 18 14:18:31 crc kubenswrapper[4912]: I0318 14:18:31.830480 4912 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf"} err="failed to get container status \"b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf\": rpc error: code = NotFound desc = could not find container \"b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf\": container with ID starting with b0c0211548062bc1aefdcbcf07aeaef60e1750e88323d6b0eba2fb2face342bf not found: ID does not exist" Mar 18 14:18:31 crc kubenswrapper[4912]: I0318 14:18:31.830520 4912 scope.go:117] "RemoveContainer" containerID="42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574" Mar 18 14:18:31 crc kubenswrapper[4912]: E0318 14:18:31.831097 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574\": container with ID starting with 42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574 not found: ID does not exist" containerID="42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574" Mar 18 14:18:31 crc kubenswrapper[4912]: I0318 14:18:31.831146 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574"} err="failed to get container status \"42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574\": rpc error: code = NotFound desc = could not find container \"42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574\": container with ID starting with 42f8fe8b3723c6286feb8e79186b9f20c1672c91689096442a9aaa66bcf88574 not found: ID does not exist" Mar 18 14:18:31 crc kubenswrapper[4912]: I0318 14:18:31.831177 4912 scope.go:117] "RemoveContainer" containerID="72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501" Mar 18 14:18:31 crc kubenswrapper[4912]: E0318 
14:18:31.831685 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501\": container with ID starting with 72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501 not found: ID does not exist" containerID="72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501" Mar 18 14:18:31 crc kubenswrapper[4912]: I0318 14:18:31.831723 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501"} err="failed to get container status \"72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501\": rpc error: code = NotFound desc = could not find container \"72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501\": container with ID starting with 72c90c89ef54bffe232d94b7bc4cd88240e2119b068a8fd6033b5987f57cc501 not found: ID does not exist" Mar 18 14:18:32 crc kubenswrapper[4912]: I0318 14:18:32.252634 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" path="/var/lib/kubelet/pods/13741a01-ac7a-4ec0-8ada-a78da43bb9a0/volumes" Mar 18 14:18:58 crc kubenswrapper[4912]: I0318 14:18:58.991025 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6f59b977c9-rwwx4" podUID="08a4effe-9a7e-449c-aba4-74d4b7a4f0ae" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 18 14:19:06 crc kubenswrapper[4912]: I0318 14:19:06.572005 4912 scope.go:117] "RemoveContainer" containerID="0005edb35531e76c6fac29a3b3a653614f1cd9151f38c59af8346c51f5b72747" Mar 18 14:19:06 crc kubenswrapper[4912]: I0318 14:19:06.998438 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:19:06 crc kubenswrapper[4912]: I0318 14:19:06.998919 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:19:36 crc kubenswrapper[4912]: I0318 14:19:36.998700 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:19:37 crc kubenswrapper[4912]: I0318 14:19:36.999441 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.159677 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564060-bklcg"] Mar 18 14:20:00 crc kubenswrapper[4912]: E0318 14:20:00.160945 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerName="extract-content" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.160960 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerName="extract-content" Mar 18 14:20:00 crc kubenswrapper[4912]: E0318 14:20:00.160995 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" 
containerName="registry-server" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.161002 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerName="registry-server" Mar 18 14:20:00 crc kubenswrapper[4912]: E0318 14:20:00.161051 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerName="extract-utilities" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.161059 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerName="extract-utilities" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.161309 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="13741a01-ac7a-4ec0-8ada-a78da43bb9a0" containerName="registry-server" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.162300 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-bklcg" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.164992 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.165195 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.165237 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.174311 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-bklcg"] Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.256582 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtf95\" (UniqueName: 
\"kubernetes.io/projected/92ade81f-82a5-4b11-9aab-09c3c06cbcb0-kube-api-access-dtf95\") pod \"auto-csr-approver-29564060-bklcg\" (UID: \"92ade81f-82a5-4b11-9aab-09c3c06cbcb0\") " pod="openshift-infra/auto-csr-approver-29564060-bklcg" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.359414 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtf95\" (UniqueName: \"kubernetes.io/projected/92ade81f-82a5-4b11-9aab-09c3c06cbcb0-kube-api-access-dtf95\") pod \"auto-csr-approver-29564060-bklcg\" (UID: \"92ade81f-82a5-4b11-9aab-09c3c06cbcb0\") " pod="openshift-infra/auto-csr-approver-29564060-bklcg" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.398640 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtf95\" (UniqueName: \"kubernetes.io/projected/92ade81f-82a5-4b11-9aab-09c3c06cbcb0-kube-api-access-dtf95\") pod \"auto-csr-approver-29564060-bklcg\" (UID: \"92ade81f-82a5-4b11-9aab-09c3c06cbcb0\") " pod="openshift-infra/auto-csr-approver-29564060-bklcg" Mar 18 14:20:00 crc kubenswrapper[4912]: I0318 14:20:00.497510 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-bklcg" Mar 18 14:20:01 crc kubenswrapper[4912]: I0318 14:20:01.009560 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-bklcg"] Mar 18 14:20:01 crc kubenswrapper[4912]: I0318 14:20:01.045752 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-bklcg" event={"ID":"92ade81f-82a5-4b11-9aab-09c3c06cbcb0","Type":"ContainerStarted","Data":"05608fdcd6d154e62a3018c0dc21f4bd76a4594469d4da1ef7326c1c8c9deb96"} Mar 18 14:20:04 crc kubenswrapper[4912]: I0318 14:20:04.083090 4912 generic.go:334] "Generic (PLEG): container finished" podID="92ade81f-82a5-4b11-9aab-09c3c06cbcb0" containerID="231f4e711a059e0ade1102d29e6c83307ef7a8431790799d2544bccc92c89808" exitCode=0 Mar 18 14:20:04 crc kubenswrapper[4912]: I0318 14:20:04.083618 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-bklcg" event={"ID":"92ade81f-82a5-4b11-9aab-09c3c06cbcb0","Type":"ContainerDied","Data":"231f4e711a059e0ade1102d29e6c83307ef7a8431790799d2544bccc92c89808"} Mar 18 14:20:05 crc kubenswrapper[4912]: I0318 14:20:05.556209 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-bklcg" Mar 18 14:20:05 crc kubenswrapper[4912]: I0318 14:20:05.639644 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtf95\" (UniqueName: \"kubernetes.io/projected/92ade81f-82a5-4b11-9aab-09c3c06cbcb0-kube-api-access-dtf95\") pod \"92ade81f-82a5-4b11-9aab-09c3c06cbcb0\" (UID: \"92ade81f-82a5-4b11-9aab-09c3c06cbcb0\") " Mar 18 14:20:05 crc kubenswrapper[4912]: I0318 14:20:05.647523 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ade81f-82a5-4b11-9aab-09c3c06cbcb0-kube-api-access-dtf95" (OuterVolumeSpecName: "kube-api-access-dtf95") pod "92ade81f-82a5-4b11-9aab-09c3c06cbcb0" (UID: "92ade81f-82a5-4b11-9aab-09c3c06cbcb0"). InnerVolumeSpecName "kube-api-access-dtf95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:05 crc kubenswrapper[4912]: I0318 14:20:05.743848 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtf95\" (UniqueName: \"kubernetes.io/projected/92ade81f-82a5-4b11-9aab-09c3c06cbcb0-kube-api-access-dtf95\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.124016 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-bklcg" event={"ID":"92ade81f-82a5-4b11-9aab-09c3c06cbcb0","Type":"ContainerDied","Data":"05608fdcd6d154e62a3018c0dc21f4bd76a4594469d4da1ef7326c1c8c9deb96"} Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.124363 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05608fdcd6d154e62a3018c0dc21f4bd76a4594469d4da1ef7326c1c8c9deb96" Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.124226 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-bklcg" Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.643373 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-cbdc8"] Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.658150 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-cbdc8"] Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.998837 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.998904 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:20:06 crc kubenswrapper[4912]: I0318 14:20:06.998960 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 14:20:07 crc kubenswrapper[4912]: I0318 14:20:07.000299 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:20:07 crc kubenswrapper[4912]: I0318 14:20:07.000375 4912 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" gracePeriod=600 Mar 18 14:20:07 crc kubenswrapper[4912]: E0318 14:20:07.124250 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:20:07 crc kubenswrapper[4912]: I0318 14:20:07.145342 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" exitCode=0 Mar 18 14:20:07 crc kubenswrapper[4912]: I0318 14:20:07.145400 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a"} Mar 18 14:20:07 crc kubenswrapper[4912]: I0318 14:20:07.145458 4912 scope.go:117] "RemoveContainer" containerID="6da300afdcb0ef55d1971d4474b7702217d071cc871cb9db8ddd6d8194adf14d" Mar 18 14:20:07 crc kubenswrapper[4912]: I0318 14:20:07.146622 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:20:07 crc kubenswrapper[4912]: E0318 14:20:07.147057 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:20:08 crc kubenswrapper[4912]: I0318 14:20:08.249767 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6b52b0-2b67-4262-b045-3cb58a8c9cda" path="/var/lib/kubelet/pods/7e6b52b0-2b67-4262-b045-3cb58a8c9cda/volumes" Mar 18 14:20:20 crc kubenswrapper[4912]: I0318 14:20:20.231269 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:20:20 crc kubenswrapper[4912]: E0318 14:20:20.233444 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:20:35 crc kubenswrapper[4912]: I0318 14:20:35.228174 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:20:35 crc kubenswrapper[4912]: E0318 14:20:35.229297 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:20:48 crc kubenswrapper[4912]: I0318 14:20:48.228325 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 
18 14:20:48 crc kubenswrapper[4912]: E0318 14:20:48.229319 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.527844 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-skgtv"] Mar 18 14:20:55 crc kubenswrapper[4912]: E0318 14:20:55.529502 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ade81f-82a5-4b11-9aab-09c3c06cbcb0" containerName="oc" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.529523 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ade81f-82a5-4b11-9aab-09c3c06cbcb0" containerName="oc" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.529845 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ade81f-82a5-4b11-9aab-09c3c06cbcb0" containerName="oc" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.532468 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.558180 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-skgtv"] Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.593973 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-utilities\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.594025 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48lw\" (UniqueName: \"kubernetes.io/projected/b767d355-74ac-4008-92cb-012a39287f57-kube-api-access-q48lw\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.594737 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-catalog-content\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.700130 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-utilities\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.700184 4912 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q48lw\" (UniqueName: \"kubernetes.io/projected/b767d355-74ac-4008-92cb-012a39287f57-kube-api-access-q48lw\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.700333 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-catalog-content\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.700973 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-catalog-content\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.701146 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-utilities\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.733720 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48lw\" (UniqueName: \"kubernetes.io/projected/b767d355-74ac-4008-92cb-012a39287f57-kube-api-access-q48lw\") pod \"community-operators-skgtv\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:55 crc kubenswrapper[4912]: I0318 14:20:55.903430 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:20:56 crc kubenswrapper[4912]: I0318 14:20:56.912757 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-skgtv"] Mar 18 14:20:57 crc kubenswrapper[4912]: I0318 14:20:57.768752 4912 generic.go:334] "Generic (PLEG): container finished" podID="b767d355-74ac-4008-92cb-012a39287f57" containerID="e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f" exitCode=0 Mar 18 14:20:57 crc kubenswrapper[4912]: I0318 14:20:57.768876 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skgtv" event={"ID":"b767d355-74ac-4008-92cb-012a39287f57","Type":"ContainerDied","Data":"e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f"} Mar 18 14:20:57 crc kubenswrapper[4912]: I0318 14:20:57.769484 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skgtv" event={"ID":"b767d355-74ac-4008-92cb-012a39287f57","Type":"ContainerStarted","Data":"f82e53298741d3589ec0e8de26988aa87342c8610dc0ed6ad8253b50b27090b5"} Mar 18 14:20:59 crc kubenswrapper[4912]: I0318 14:20:59.802792 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skgtv" event={"ID":"b767d355-74ac-4008-92cb-012a39287f57","Type":"ContainerStarted","Data":"befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c"} Mar 18 14:21:00 crc kubenswrapper[4912]: I0318 14:21:00.228854 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:21:00 crc kubenswrapper[4912]: E0318 14:21:00.229580 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:21:00 crc kubenswrapper[4912]: I0318 14:21:00.814695 4912 generic.go:334] "Generic (PLEG): container finished" podID="b767d355-74ac-4008-92cb-012a39287f57" containerID="befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c" exitCode=0 Mar 18 14:21:00 crc kubenswrapper[4912]: I0318 14:21:00.814743 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skgtv" event={"ID":"b767d355-74ac-4008-92cb-012a39287f57","Type":"ContainerDied","Data":"befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c"} Mar 18 14:21:01 crc kubenswrapper[4912]: I0318 14:21:01.831728 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skgtv" event={"ID":"b767d355-74ac-4008-92cb-012a39287f57","Type":"ContainerStarted","Data":"40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b"} Mar 18 14:21:01 crc kubenswrapper[4912]: I0318 14:21:01.873507 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-skgtv" podStartSLOduration=3.328753099 podStartE2EDuration="6.873476435s" podCreationTimestamp="2026-03-18 14:20:55 +0000 UTC" firstStartedPulling="2026-03-18 14:20:57.77869473 +0000 UTC m=+4706.238122175" lastFinishedPulling="2026-03-18 14:21:01.323418086 +0000 UTC m=+4709.782845511" observedRunningTime="2026-03-18 14:21:01.856164776 +0000 UTC m=+4710.315592211" watchObservedRunningTime="2026-03-18 14:21:01.873476435 +0000 UTC m=+4710.332903870" Mar 18 14:21:05 crc kubenswrapper[4912]: I0318 14:21:05.904766 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:21:05 crc kubenswrapper[4912]: I0318 
14:21:05.906192 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:21:06 crc kubenswrapper[4912]: I0318 14:21:06.382699 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:21:06 crc kubenswrapper[4912]: I0318 14:21:06.725177 4912 scope.go:117] "RemoveContainer" containerID="83a52a16289a3cdef81a88a6f3a89057be675f747d9df87b4092a8bd128fbce9" Mar 18 14:21:06 crc kubenswrapper[4912]: I0318 14:21:06.947113 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:21:07 crc kubenswrapper[4912]: I0318 14:21:07.019075 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-skgtv"] Mar 18 14:21:08 crc kubenswrapper[4912]: I0318 14:21:08.925847 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-skgtv" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="registry-server" containerID="cri-o://40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b" gracePeriod=2 Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.541467 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.736622 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-utilities\") pod \"b767d355-74ac-4008-92cb-012a39287f57\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.736840 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q48lw\" (UniqueName: \"kubernetes.io/projected/b767d355-74ac-4008-92cb-012a39287f57-kube-api-access-q48lw\") pod \"b767d355-74ac-4008-92cb-012a39287f57\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.737016 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-catalog-content\") pod \"b767d355-74ac-4008-92cb-012a39287f57\" (UID: \"b767d355-74ac-4008-92cb-012a39287f57\") " Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.738104 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-utilities" (OuterVolumeSpecName: "utilities") pod "b767d355-74ac-4008-92cb-012a39287f57" (UID: "b767d355-74ac-4008-92cb-012a39287f57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.744946 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b767d355-74ac-4008-92cb-012a39287f57-kube-api-access-q48lw" (OuterVolumeSpecName: "kube-api-access-q48lw") pod "b767d355-74ac-4008-92cb-012a39287f57" (UID: "b767d355-74ac-4008-92cb-012a39287f57"). InnerVolumeSpecName "kube-api-access-q48lw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.798702 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b767d355-74ac-4008-92cb-012a39287f57" (UID: "b767d355-74ac-4008-92cb-012a39287f57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.840634 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q48lw\" (UniqueName: \"kubernetes.io/projected/b767d355-74ac-4008-92cb-012a39287f57-kube-api-access-q48lw\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.840686 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.840696 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b767d355-74ac-4008-92cb-012a39287f57-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.940858 4912 generic.go:334] "Generic (PLEG): container finished" podID="b767d355-74ac-4008-92cb-012a39287f57" containerID="40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b" exitCode=0 Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.940930 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skgtv" event={"ID":"b767d355-74ac-4008-92cb-012a39287f57","Type":"ContainerDied","Data":"40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b"} Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.940968 4912 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-skgtv" event={"ID":"b767d355-74ac-4008-92cb-012a39287f57","Type":"ContainerDied","Data":"f82e53298741d3589ec0e8de26988aa87342c8610dc0ed6ad8253b50b27090b5"} Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.940994 4912 scope.go:117] "RemoveContainer" containerID="40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.941215 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-skgtv" Mar 18 14:21:09 crc kubenswrapper[4912]: I0318 14:21:09.992282 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-skgtv"] Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.005364 4912 scope.go:117] "RemoveContainer" containerID="befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.010428 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-skgtv"] Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.034655 4912 scope.go:117] "RemoveContainer" containerID="e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.089214 4912 scope.go:117] "RemoveContainer" containerID="40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b" Mar 18 14:21:10 crc kubenswrapper[4912]: E0318 14:21:10.089882 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b\": container with ID starting with 40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b not found: ID does not exist" containerID="40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 
14:21:10.089951 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b"} err="failed to get container status \"40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b\": rpc error: code = NotFound desc = could not find container \"40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b\": container with ID starting with 40a282ccf9fff6b8a0b8c7feee84ef8f920ccddb4eef5b8e99f694d4fa4d721b not found: ID does not exist" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.090014 4912 scope.go:117] "RemoveContainer" containerID="befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c" Mar 18 14:21:10 crc kubenswrapper[4912]: E0318 14:21:10.090589 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c\": container with ID starting with befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c not found: ID does not exist" containerID="befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.090636 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c"} err="failed to get container status \"befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c\": rpc error: code = NotFound desc = could not find container \"befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c\": container with ID starting with befb79bd2bc6a098c76680e266685fe30842b13e3c950b981737ab604042217c not found: ID does not exist" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.090661 4912 scope.go:117] "RemoveContainer" containerID="e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f" Mar 18 14:21:10 crc 
kubenswrapper[4912]: E0318 14:21:10.091109 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f\": container with ID starting with e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f not found: ID does not exist" containerID="e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.091166 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f"} err="failed to get container status \"e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f\": rpc error: code = NotFound desc = could not find container \"e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f\": container with ID starting with e8d6e879eb124033f49ff6cc10131a0078b5a1d9b8f6ba3aa3af9eb14406561f not found: ID does not exist" Mar 18 14:21:10 crc kubenswrapper[4912]: I0318 14:21:10.242564 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b767d355-74ac-4008-92cb-012a39287f57" path="/var/lib/kubelet/pods/b767d355-74ac-4008-92cb-012a39287f57/volumes" Mar 18 14:21:11 crc kubenswrapper[4912]: I0318 14:21:11.227955 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:21:11 crc kubenswrapper[4912]: E0318 14:21:11.228696 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:21:26 crc 
kubenswrapper[4912]: I0318 14:21:26.228883 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:21:26 crc kubenswrapper[4912]: E0318 14:21:26.229880 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:21:38 crc kubenswrapper[4912]: I0318 14:21:38.229109 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:21:38 crc kubenswrapper[4912]: E0318 14:21:38.229926 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:21:53 crc kubenswrapper[4912]: I0318 14:21:53.228357 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:21:53 crc kubenswrapper[4912]: E0318 14:21:53.229340 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 
18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.166305 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564062-vrtbg"] Mar 18 14:22:00 crc kubenswrapper[4912]: E0318 14:22:00.167788 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="extract-utilities" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.167806 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="extract-utilities" Mar 18 14:22:00 crc kubenswrapper[4912]: E0318 14:22:00.167822 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="extract-content" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.167831 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="extract-content" Mar 18 14:22:00 crc kubenswrapper[4912]: E0318 14:22:00.167876 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="registry-server" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.167884 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="registry-server" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.168165 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b767d355-74ac-4008-92cb-012a39287f57" containerName="registry-server" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.169223 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-vrtbg" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.172183 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.172191 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.172300 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.181916 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-vrtbg"] Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.336383 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvvh\" (UniqueName: \"kubernetes.io/projected/56a2eef2-104c-46be-ae21-de8f5817f558-kube-api-access-ksvvh\") pod \"auto-csr-approver-29564062-vrtbg\" (UID: \"56a2eef2-104c-46be-ae21-de8f5817f558\") " pod="openshift-infra/auto-csr-approver-29564062-vrtbg" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.440472 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksvvh\" (UniqueName: \"kubernetes.io/projected/56a2eef2-104c-46be-ae21-de8f5817f558-kube-api-access-ksvvh\") pod \"auto-csr-approver-29564062-vrtbg\" (UID: \"56a2eef2-104c-46be-ae21-de8f5817f558\") " pod="openshift-infra/auto-csr-approver-29564062-vrtbg" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.464703 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksvvh\" (UniqueName: \"kubernetes.io/projected/56a2eef2-104c-46be-ae21-de8f5817f558-kube-api-access-ksvvh\") pod \"auto-csr-approver-29564062-vrtbg\" (UID: \"56a2eef2-104c-46be-ae21-de8f5817f558\") " 
pod="openshift-infra/auto-csr-approver-29564062-vrtbg" Mar 18 14:22:00 crc kubenswrapper[4912]: I0318 14:22:00.496010 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-vrtbg" Mar 18 14:22:01 crc kubenswrapper[4912]: I0318 14:22:01.147689 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-vrtbg"] Mar 18 14:22:01 crc kubenswrapper[4912]: I0318 14:22:01.616365 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-vrtbg" event={"ID":"56a2eef2-104c-46be-ae21-de8f5817f558","Type":"ContainerStarted","Data":"d6c8495766c06d17947ddd7444def2b6b45bf9856b57a2013828d5c3c31389d2"} Mar 18 14:22:03 crc kubenswrapper[4912]: I0318 14:22:03.652850 4912 generic.go:334] "Generic (PLEG): container finished" podID="56a2eef2-104c-46be-ae21-de8f5817f558" containerID="6c0b5a8916f0a127d2dc34d512d26d5d805a12d95c18bb74bef03414ef89cd37" exitCode=0 Mar 18 14:22:03 crc kubenswrapper[4912]: I0318 14:22:03.652956 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-vrtbg" event={"ID":"56a2eef2-104c-46be-ae21-de8f5817f558","Type":"ContainerDied","Data":"6c0b5a8916f0a127d2dc34d512d26d5d805a12d95c18bb74bef03414ef89cd37"} Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.095738 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-vrtbg" Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.186227 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksvvh\" (UniqueName: \"kubernetes.io/projected/56a2eef2-104c-46be-ae21-de8f5817f558-kube-api-access-ksvvh\") pod \"56a2eef2-104c-46be-ae21-de8f5817f558\" (UID: \"56a2eef2-104c-46be-ae21-de8f5817f558\") " Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.195231 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a2eef2-104c-46be-ae21-de8f5817f558-kube-api-access-ksvvh" (OuterVolumeSpecName: "kube-api-access-ksvvh") pod "56a2eef2-104c-46be-ae21-de8f5817f558" (UID: "56a2eef2-104c-46be-ae21-de8f5817f558"). InnerVolumeSpecName "kube-api-access-ksvvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.228612 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:22:05 crc kubenswrapper[4912]: E0318 14:22:05.229247 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.293174 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksvvh\" (UniqueName: \"kubernetes.io/projected/56a2eef2-104c-46be-ae21-de8f5817f558-kube-api-access-ksvvh\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.682252 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564062-vrtbg" event={"ID":"56a2eef2-104c-46be-ae21-de8f5817f558","Type":"ContainerDied","Data":"d6c8495766c06d17947ddd7444def2b6b45bf9856b57a2013828d5c3c31389d2"} Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.682617 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c8495766c06d17947ddd7444def2b6b45bf9856b57a2013828d5c3c31389d2" Mar 18 14:22:05 crc kubenswrapper[4912]: I0318 14:22:05.682376 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-vrtbg" Mar 18 14:22:06 crc kubenswrapper[4912]: I0318 14:22:06.194578 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-k76gj"] Mar 18 14:22:06 crc kubenswrapper[4912]: I0318 14:22:06.211834 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-k76gj"] Mar 18 14:22:06 crc kubenswrapper[4912]: I0318 14:22:06.244741 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effe7aa7-db6e-47c6-97e9-b9ab7afea1b6" path="/var/lib/kubelet/pods/effe7aa7-db6e-47c6-97e9-b9ab7afea1b6/volumes" Mar 18 14:22:06 crc kubenswrapper[4912]: I0318 14:22:06.827345 4912 scope.go:117] "RemoveContainer" containerID="3d3047b47a02dbbbb03aa4f3df823985627afd9b1e2793856b7327f89a3a71cc" Mar 18 14:22:18 crc kubenswrapper[4912]: I0318 14:22:18.229375 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:22:18 crc kubenswrapper[4912]: E0318 14:22:18.230379 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:22:31 crc kubenswrapper[4912]: I0318 14:22:31.231293 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:22:31 crc kubenswrapper[4912]: E0318 14:22:31.232801 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.986615 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 14:22:38 crc kubenswrapper[4912]: E0318 14:22:38.988405 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a2eef2-104c-46be-ae21-de8f5817f558" containerName="oc" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.988427 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a2eef2-104c-46be-ae21-de8f5817f558" containerName="oc" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.988785 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a2eef2-104c-46be-ae21-de8f5817f558" containerName="oc" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.990276 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.992888 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.993757 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.993758 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 14:22:38 crc kubenswrapper[4912]: I0318 14:22:38.993876 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6qzpq" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.000990 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.158447 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.158513 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.158548 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.158628 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.158663 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98pc2\" (UniqueName: \"kubernetes.io/projected/fab7b705-5ef2-46e6-851d-5c38d246ee55-kube-api-access-98pc2\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.158795 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.158896 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.159013 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-config-data\") 
pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.159254 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262317 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262489 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262531 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-config-data\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262606 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262692 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262737 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262774 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262871 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.262923 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98pc2\" (UniqueName: \"kubernetes.io/projected/fab7b705-5ef2-46e6-851d-5c38d246ee55-kube-api-access-98pc2\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 
14:22:39.262926 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.263481 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.265092 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-config-data\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.265196 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.265468 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.269837 4912 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.270736 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.272600 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.301682 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98pc2\" (UniqueName: \"kubernetes.io/projected/fab7b705-5ef2-46e6-851d-5c38d246ee55-kube-api-access-98pc2\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.308278 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.330738 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:22:39 crc kubenswrapper[4912]: I0318 14:22:39.875411 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 14:22:40 crc kubenswrapper[4912]: I0318 14:22:40.141683 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fab7b705-5ef2-46e6-851d-5c38d246ee55","Type":"ContainerStarted","Data":"c86be2adf5d4ca4a37f2d5b6401938b1e0a33cc393e3671fcf1863de309b6eb9"} Mar 18 14:22:42 crc kubenswrapper[4912]: I0318 14:22:42.246492 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:22:42 crc kubenswrapper[4912]: E0318 14:22:42.247551 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:22:57 crc kubenswrapper[4912]: I0318 14:22:57.228817 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:22:57 crc kubenswrapper[4912]: E0318 14:22:57.230109 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:23:10 crc kubenswrapper[4912]: I0318 14:23:10.229380 4912 scope.go:117] "RemoveContainer" 
containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:23:10 crc kubenswrapper[4912]: E0318 14:23:10.230502 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.406002 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lm297"] Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.409117 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.422979 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lm297"] Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.501522 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zf4\" (UniqueName: \"kubernetes.io/projected/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-kube-api-access-j2zf4\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.501993 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-utilities\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 
14:23:11.502030 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-catalog-content\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.604669 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zf4\" (UniqueName: \"kubernetes.io/projected/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-kube-api-access-j2zf4\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.604792 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-utilities\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.604825 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-catalog-content\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.605404 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-catalog-content\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 
14:23:11.605642 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-utilities\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.626647 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zf4\" (UniqueName: \"kubernetes.io/projected/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-kube-api-access-j2zf4\") pod \"certified-operators-lm297\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:11 crc kubenswrapper[4912]: I0318 14:23:11.782119 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:13 crc kubenswrapper[4912]: E0318 14:23:13.406544 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 18 14:23:13 crc kubenswrapper[4912]: E0318 14:23:13.410057 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98pc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(fab7b705-5ef2-46e6-851d-5c38d246ee55): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:23:13 crc kubenswrapper[4912]: E0318 14:23:13.411260 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="fab7b705-5ef2-46e6-851d-5c38d246ee55" Mar 18 14:23:13 crc kubenswrapper[4912]: E0318 14:23:13.589176 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="fab7b705-5ef2-46e6-851d-5c38d246ee55" Mar 18 14:23:14 crc 
kubenswrapper[4912]: I0318 14:23:14.116400 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lm297"] Mar 18 14:23:14 crc kubenswrapper[4912]: I0318 14:23:14.601610 4912 generic.go:334] "Generic (PLEG): container finished" podID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerID="1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077" exitCode=0 Mar 18 14:23:14 crc kubenswrapper[4912]: I0318 14:23:14.601737 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lm297" event={"ID":"f822055a-ab0b-4ae0-aca0-d3dc4adb4284","Type":"ContainerDied","Data":"1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077"} Mar 18 14:23:14 crc kubenswrapper[4912]: I0318 14:23:14.602206 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lm297" event={"ID":"f822055a-ab0b-4ae0-aca0-d3dc4adb4284","Type":"ContainerStarted","Data":"755bc03b9f7ce57a204486bda88ca301a41999eaef2fefbe8b49f49a63a0ce70"} Mar 18 14:23:14 crc kubenswrapper[4912]: I0318 14:23:14.605181 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:23:15 crc kubenswrapper[4912]: I0318 14:23:15.616804 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lm297" event={"ID":"f822055a-ab0b-4ae0-aca0-d3dc4adb4284","Type":"ContainerStarted","Data":"38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b"} Mar 18 14:23:17 crc kubenswrapper[4912]: I0318 14:23:17.656302 4912 generic.go:334] "Generic (PLEG): container finished" podID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerID="38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b" exitCode=0 Mar 18 14:23:17 crc kubenswrapper[4912]: I0318 14:23:17.656405 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lm297" 
event={"ID":"f822055a-ab0b-4ae0-aca0-d3dc4adb4284","Type":"ContainerDied","Data":"38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b"} Mar 18 14:23:19 crc kubenswrapper[4912]: I0318 14:23:19.685972 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lm297" event={"ID":"f822055a-ab0b-4ae0-aca0-d3dc4adb4284","Type":"ContainerStarted","Data":"cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8"} Mar 18 14:23:19 crc kubenswrapper[4912]: I0318 14:23:19.716310 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lm297" podStartSLOduration=4.704453733 podStartE2EDuration="8.71628921s" podCreationTimestamp="2026-03-18 14:23:11 +0000 UTC" firstStartedPulling="2026-03-18 14:23:14.604882889 +0000 UTC m=+4843.064310314" lastFinishedPulling="2026-03-18 14:23:18.616718366 +0000 UTC m=+4847.076145791" observedRunningTime="2026-03-18 14:23:19.704928742 +0000 UTC m=+4848.164356167" watchObservedRunningTime="2026-03-18 14:23:19.71628921 +0000 UTC m=+4848.175716635" Mar 18 14:23:21 crc kubenswrapper[4912]: I0318 14:23:21.783227 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:21 crc kubenswrapper[4912]: I0318 14:23:21.784358 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:21 crc kubenswrapper[4912]: I0318 14:23:21.844525 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:24 crc kubenswrapper[4912]: I0318 14:23:24.228276 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:23:24 crc kubenswrapper[4912]: E0318 14:23:24.228963 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:23:28 crc kubenswrapper[4912]: I0318 14:23:28.832407 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fab7b705-5ef2-46e6-851d-5c38d246ee55","Type":"ContainerStarted","Data":"43990ccac5d94e76aadc376be4e39c15f7e1069f37d2b8aa71d5d9ff32e9f697"} Mar 18 14:23:28 crc kubenswrapper[4912]: I0318 14:23:28.868589 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.018440747 podStartE2EDuration="51.868566016s" podCreationTimestamp="2026-03-18 14:22:37 +0000 UTC" firstStartedPulling="2026-03-18 14:22:39.883489029 +0000 UTC m=+4808.342916454" lastFinishedPulling="2026-03-18 14:23:26.733614298 +0000 UTC m=+4855.193041723" observedRunningTime="2026-03-18 14:23:28.854625128 +0000 UTC m=+4857.314052563" watchObservedRunningTime="2026-03-18 14:23:28.868566016 +0000 UTC m=+4857.327993441" Mar 18 14:23:31 crc kubenswrapper[4912]: I0318 14:23:31.837245 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:31 crc kubenswrapper[4912]: I0318 14:23:31.905737 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lm297"] Mar 18 14:23:31 crc kubenswrapper[4912]: I0318 14:23:31.906008 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lm297" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="registry-server" 
containerID="cri-o://cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8" gracePeriod=2 Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.478240 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.572557 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-utilities\") pod \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.573071 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-catalog-content\") pod \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.573213 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2zf4\" (UniqueName: \"kubernetes.io/projected/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-kube-api-access-j2zf4\") pod \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\" (UID: \"f822055a-ab0b-4ae0-aca0-d3dc4adb4284\") " Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.575964 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-utilities" (OuterVolumeSpecName: "utilities") pod "f822055a-ab0b-4ae0-aca0-d3dc4adb4284" (UID: "f822055a-ab0b-4ae0-aca0-d3dc4adb4284"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.597339 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-kube-api-access-j2zf4" (OuterVolumeSpecName: "kube-api-access-j2zf4") pod "f822055a-ab0b-4ae0-aca0-d3dc4adb4284" (UID: "f822055a-ab0b-4ae0-aca0-d3dc4adb4284"). InnerVolumeSpecName "kube-api-access-j2zf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.643406 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f822055a-ab0b-4ae0-aca0-d3dc4adb4284" (UID: "f822055a-ab0b-4ae0-aca0-d3dc4adb4284"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.676767 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.676844 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.676865 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2zf4\" (UniqueName: \"kubernetes.io/projected/f822055a-ab0b-4ae0-aca0-d3dc4adb4284-kube-api-access-j2zf4\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.896831 4912 generic.go:334] "Generic (PLEG): container finished" podID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" 
containerID="cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8" exitCode=0 Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.896933 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lm297" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.896964 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lm297" event={"ID":"f822055a-ab0b-4ae0-aca0-d3dc4adb4284","Type":"ContainerDied","Data":"cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8"} Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.897087 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lm297" event={"ID":"f822055a-ab0b-4ae0-aca0-d3dc4adb4284","Type":"ContainerDied","Data":"755bc03b9f7ce57a204486bda88ca301a41999eaef2fefbe8b49f49a63a0ce70"} Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.897119 4912 scope.go:117] "RemoveContainer" containerID="cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.940708 4912 scope.go:117] "RemoveContainer" containerID="38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b" Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.953743 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lm297"] Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.963749 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lm297"] Mar 18 14:23:32 crc kubenswrapper[4912]: I0318 14:23:32.971986 4912 scope.go:117] "RemoveContainer" containerID="1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077" Mar 18 14:23:33 crc kubenswrapper[4912]: I0318 14:23:33.046837 4912 scope.go:117] "RemoveContainer" containerID="cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8" Mar 18 
14:23:33 crc kubenswrapper[4912]: E0318 14:23:33.047782 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8\": container with ID starting with cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8 not found: ID does not exist" containerID="cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8" Mar 18 14:23:33 crc kubenswrapper[4912]: I0318 14:23:33.047910 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8"} err="failed to get container status \"cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8\": rpc error: code = NotFound desc = could not find container \"cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8\": container with ID starting with cab63d8771746a20cdb9fb3af6e65f3ef553e87520c911c1c7e57b3d8ed05fe8 not found: ID does not exist" Mar 18 14:23:33 crc kubenswrapper[4912]: I0318 14:23:33.047995 4912 scope.go:117] "RemoveContainer" containerID="38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b" Mar 18 14:23:33 crc kubenswrapper[4912]: E0318 14:23:33.048705 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b\": container with ID starting with 38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b not found: ID does not exist" containerID="38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b" Mar 18 14:23:33 crc kubenswrapper[4912]: I0318 14:23:33.048774 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b"} err="failed to get container status 
\"38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b\": rpc error: code = NotFound desc = could not find container \"38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b\": container with ID starting with 38ea467fedead4895915687fe41e917842d8d6caf0d926de4b9e64ecad5cc56b not found: ID does not exist" Mar 18 14:23:33 crc kubenswrapper[4912]: I0318 14:23:33.048821 4912 scope.go:117] "RemoveContainer" containerID="1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077" Mar 18 14:23:33 crc kubenswrapper[4912]: E0318 14:23:33.049345 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077\": container with ID starting with 1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077 not found: ID does not exist" containerID="1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077" Mar 18 14:23:33 crc kubenswrapper[4912]: I0318 14:23:33.049377 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077"} err="failed to get container status \"1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077\": rpc error: code = NotFound desc = could not find container \"1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077\": container with ID starting with 1dc4e4dde542ee6981149cd7d3a9ba9d0f907afa0975401fb45e1be04d2f6077 not found: ID does not exist" Mar 18 14:23:34 crc kubenswrapper[4912]: I0318 14:23:34.249034 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" path="/var/lib/kubelet/pods/f822055a-ab0b-4ae0-aca0-d3dc4adb4284/volumes" Mar 18 14:23:39 crc kubenswrapper[4912]: I0318 14:23:39.229237 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 
14:23:39 crc kubenswrapper[4912]: E0318 14:23:39.230372 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:23:52 crc kubenswrapper[4912]: I0318 14:23:52.241137 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:23:52 crc kubenswrapper[4912]: E0318 14:23:52.242908 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.163606 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564064-bn24n"] Mar 18 14:24:00 crc kubenswrapper[4912]: E0318 14:24:00.164925 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="extract-utilities" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.164947 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="extract-utilities" Mar 18 14:24:00 crc kubenswrapper[4912]: E0318 14:24:00.165010 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="registry-server" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.165020 
4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="registry-server" Mar 18 14:24:00 crc kubenswrapper[4912]: E0318 14:24:00.165076 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="extract-content" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.165088 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="extract-content" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.165395 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="f822055a-ab0b-4ae0-aca0-d3dc4adb4284" containerName="registry-server" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.166870 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-bn24n" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.171605 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.172183 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.172564 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.182684 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-bn24n"] Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.321907 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jbx\" (UniqueName: \"kubernetes.io/projected/7381c303-a98e-4755-91f4-beab8d1ab273-kube-api-access-k8jbx\") pod \"auto-csr-approver-29564064-bn24n\" (UID: 
\"7381c303-a98e-4755-91f4-beab8d1ab273\") " pod="openshift-infra/auto-csr-approver-29564064-bn24n" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.425642 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jbx\" (UniqueName: \"kubernetes.io/projected/7381c303-a98e-4755-91f4-beab8d1ab273-kube-api-access-k8jbx\") pod \"auto-csr-approver-29564064-bn24n\" (UID: \"7381c303-a98e-4755-91f4-beab8d1ab273\") " pod="openshift-infra/auto-csr-approver-29564064-bn24n" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.453950 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jbx\" (UniqueName: \"kubernetes.io/projected/7381c303-a98e-4755-91f4-beab8d1ab273-kube-api-access-k8jbx\") pod \"auto-csr-approver-29564064-bn24n\" (UID: \"7381c303-a98e-4755-91f4-beab8d1ab273\") " pod="openshift-infra/auto-csr-approver-29564064-bn24n" Mar 18 14:24:00 crc kubenswrapper[4912]: I0318 14:24:00.505330 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-bn24n" Mar 18 14:24:01 crc kubenswrapper[4912]: I0318 14:24:01.062082 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-bn24n"] Mar 18 14:24:01 crc kubenswrapper[4912]: I0318 14:24:01.288063 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-bn24n" event={"ID":"7381c303-a98e-4755-91f4-beab8d1ab273","Type":"ContainerStarted","Data":"85126aeef626070624c26cd1795ebad953d7b624178417953f98eb6833f20baa"} Mar 18 14:24:04 crc kubenswrapper[4912]: I0318 14:24:04.349245 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-bn24n" event={"ID":"7381c303-a98e-4755-91f4-beab8d1ab273","Type":"ContainerStarted","Data":"ec604b75259b9429055a65227930f180a6208f345639a1334cb863b306e31875"} Mar 18 14:24:04 crc kubenswrapper[4912]: I0318 14:24:04.370019 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564064-bn24n" podStartSLOduration=2.8428436599999998 podStartE2EDuration="4.370002663s" podCreationTimestamp="2026-03-18 14:24:00 +0000 UTC" firstStartedPulling="2026-03-18 14:24:01.065132921 +0000 UTC m=+4889.524560346" lastFinishedPulling="2026-03-18 14:24:02.592291914 +0000 UTC m=+4891.051719349" observedRunningTime="2026-03-18 14:24:04.366885229 +0000 UTC m=+4892.826312674" watchObservedRunningTime="2026-03-18 14:24:04.370002663 +0000 UTC m=+4892.829430088" Mar 18 14:24:05 crc kubenswrapper[4912]: I0318 14:24:05.362438 4912 generic.go:334] "Generic (PLEG): container finished" podID="7381c303-a98e-4755-91f4-beab8d1ab273" containerID="ec604b75259b9429055a65227930f180a6208f345639a1334cb863b306e31875" exitCode=0 Mar 18 14:24:05 crc kubenswrapper[4912]: I0318 14:24:05.362520 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-bn24n" 
event={"ID":"7381c303-a98e-4755-91f4-beab8d1ab273","Type":"ContainerDied","Data":"ec604b75259b9429055a65227930f180a6208f345639a1334cb863b306e31875"} Mar 18 14:24:07 crc kubenswrapper[4912]: I0318 14:24:07.228285 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:24:07 crc kubenswrapper[4912]: E0318 14:24:07.229573 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:24:07 crc kubenswrapper[4912]: I0318 14:24:07.566797 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-bn24n" Mar 18 14:24:07 crc kubenswrapper[4912]: I0318 14:24:07.704147 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8jbx\" (UniqueName: \"kubernetes.io/projected/7381c303-a98e-4755-91f4-beab8d1ab273-kube-api-access-k8jbx\") pod \"7381c303-a98e-4755-91f4-beab8d1ab273\" (UID: \"7381c303-a98e-4755-91f4-beab8d1ab273\") " Mar 18 14:24:07 crc kubenswrapper[4912]: I0318 14:24:07.713229 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7381c303-a98e-4755-91f4-beab8d1ab273-kube-api-access-k8jbx" (OuterVolumeSpecName: "kube-api-access-k8jbx") pod "7381c303-a98e-4755-91f4-beab8d1ab273" (UID: "7381c303-a98e-4755-91f4-beab8d1ab273"). InnerVolumeSpecName "kube-api-access-k8jbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:07 crc kubenswrapper[4912]: I0318 14:24:07.807766 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8jbx\" (UniqueName: \"kubernetes.io/projected/7381c303-a98e-4755-91f4-beab8d1ab273-kube-api-access-k8jbx\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:08 crc kubenswrapper[4912]: I0318 14:24:08.424174 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-bn24n" event={"ID":"7381c303-a98e-4755-91f4-beab8d1ab273","Type":"ContainerDied","Data":"85126aeef626070624c26cd1795ebad953d7b624178417953f98eb6833f20baa"} Mar 18 14:24:08 crc kubenswrapper[4912]: I0318 14:24:08.424267 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85126aeef626070624c26cd1795ebad953d7b624178417953f98eb6833f20baa" Mar 18 14:24:08 crc kubenswrapper[4912]: I0318 14:24:08.424402 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-bn24n" Mar 18 14:24:08 crc kubenswrapper[4912]: I0318 14:24:08.652305 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-srdzd"] Mar 18 14:24:08 crc kubenswrapper[4912]: I0318 14:24:08.671639 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-srdzd"] Mar 18 14:24:10 crc kubenswrapper[4912]: I0318 14:24:10.246473 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a07a877-c409-4ec5-a89b-526735d5c1a7" path="/var/lib/kubelet/pods/6a07a877-c409-4ec5-a89b-526735d5c1a7/volumes" Mar 18 14:24:18 crc kubenswrapper[4912]: I0318 14:24:18.229190 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:24:18 crc kubenswrapper[4912]: E0318 14:24:18.230724 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:24:32 crc kubenswrapper[4912]: I0318 14:24:32.238545 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:24:32 crc kubenswrapper[4912]: E0318 14:24:32.241097 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:24:43 crc kubenswrapper[4912]: I0318 14:24:43.229121 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:24:43 crc kubenswrapper[4912]: E0318 14:24:43.229941 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:24:55 crc kubenswrapper[4912]: I0318 14:24:55.227827 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:24:55 crc kubenswrapper[4912]: E0318 14:24:55.228817 4912 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:25:06 crc kubenswrapper[4912]: I0318 14:25:06.229540 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:25:06 crc kubenswrapper[4912]: E0318 14:25:06.230509 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:25:07 crc kubenswrapper[4912]: I0318 14:25:07.057880 4912 scope.go:117] "RemoveContainer" containerID="5ad728797ae56c6e122505dcd6466eca70b82e4da1f03068956a8c73de12df29" Mar 18 14:25:20 crc kubenswrapper[4912]: I0318 14:25:20.230223 4912 scope.go:117] "RemoveContainer" containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:25:21 crc kubenswrapper[4912]: I0318 14:25:21.447947 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"c5ced18fd2e5d09788e975802c82a54bc9a779ddaeae6d48f618506d7040a53a"} Mar 18 14:25:50 crc kubenswrapper[4912]: I0318 14:25:50.297374 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="frr" probeResult="failure" 
output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.160288 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.160284 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.174324 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.174288 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.197700 4912 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-jkd5w container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.197788 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" podUID="3e51cc8b-d69c-4be9-8b12-c1a10c653621" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.251149 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.251256 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.251173 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.251378 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" 
probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.521406 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.521497 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.521579 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.521715 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.938280 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:01 crc kubenswrapper[4912]: I0318 14:26:01.938321 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:02 crc kubenswrapper[4912]: I0318 14:26:02.597187 4912 patch_prober.go:28] interesting pod/console-c665c8f96-4wcs8 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:02 crc kubenswrapper[4912]: I0318 14:26:02.597852 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:02 crc kubenswrapper[4912]: I0318 14:26:02.946351 4912 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-s2ztv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:02 crc kubenswrapper[4912]: I0318 14:26:02.946436 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" podUID="cd78a5ca-41b5-48af-a603-0ac01cbde069" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.081370 4912 trace.go:236] Trace[714607121]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-d8c92" (18-Mar-2026 14:26:01.770) (total time: 1297ms): Mar 18 14:26:03 crc kubenswrapper[4912]: Trace[714607121]: [1.29786799s] [1.29786799s] END Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.184191 4912 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-jgbgh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.184315 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" podUID="d49e1c94-aaf7-4502-ad56-46296a08cf03" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.315677 4912 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-tcbf4 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.316159 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" podUID="95f374dc-f34c-48df-a280-3434f082b6d0" containerName="loki-query-frontend" probeResult="failure" output="Get 
\"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.433745 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.434312 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.453171 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.453275 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.806886 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564066-lmtc2"] Mar 18 14:26:03 crc kubenswrapper[4912]: E0318 14:26:03.809181 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7381c303-a98e-4755-91f4-beab8d1ab273" containerName="oc" 
Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.809201 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="7381c303-a98e-4755-91f4-beab8d1ab273" containerName="oc" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.811000 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="7381c303-a98e-4755-91f4-beab8d1ab273" containerName="oc" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.818353 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.826194 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.863769 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.864511 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:26:03 crc kubenswrapper[4912]: I0318 14:26:03.937242 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9l8b\" (UniqueName: \"kubernetes.io/projected/54c16250-e6b3-4308-bf15-d4633c661d9e-kube-api-access-l9l8b\") pod \"auto-csr-approver-29564066-lmtc2\" (UID: \"54c16250-e6b3-4308-bf15-d4633c661d9e\") " pod="openshift-infra/auto-csr-approver-29564066-lmtc2" Mar 18 14:26:04 crc kubenswrapper[4912]: I0318 14:26:04.041002 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9l8b\" (UniqueName: \"kubernetes.io/projected/54c16250-e6b3-4308-bf15-d4633c661d9e-kube-api-access-l9l8b\") pod \"auto-csr-approver-29564066-lmtc2\" (UID: \"54c16250-e6b3-4308-bf15-d4633c661d9e\") " pod="openshift-infra/auto-csr-approver-29564066-lmtc2" Mar 18 14:26:04 crc kubenswrapper[4912]: I0318 
14:26:04.130494 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9l8b\" (UniqueName: \"kubernetes.io/projected/54c16250-e6b3-4308-bf15-d4633c661d9e-kube-api-access-l9l8b\") pod \"auto-csr-approver-29564066-lmtc2\" (UID: \"54c16250-e6b3-4308-bf15-d4633c661d9e\") " pod="openshift-infra/auto-csr-approver-29564066-lmtc2" Mar 18 14:26:04 crc kubenswrapper[4912]: I0318 14:26:04.202677 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" Mar 18 14:26:05 crc kubenswrapper[4912]: I0318 14:26:05.195871 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-lmtc2"] Mar 18 14:26:06 crc kubenswrapper[4912]: I0318 14:26:06.345252 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:06 crc kubenswrapper[4912]: I0318 14:26:06.348063 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:06 crc kubenswrapper[4912]: I0318 14:26:06.383388 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:06 crc 
kubenswrapper[4912]: I0318 14:26:06.383492 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:06 crc kubenswrapper[4912]: I0318 14:26:06.384184 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:06 crc kubenswrapper[4912]: I0318 14:26:06.384260 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:06 crc kubenswrapper[4912]: I0318 14:26:06.384195 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:06 crc kubenswrapper[4912]: I0318 14:26:06.384326 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:07 crc kubenswrapper[4912]: I0318 14:26:07.734528 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:07 crc kubenswrapper[4912]: I0318 14:26:07.736277 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:08 crc kubenswrapper[4912]: I0318 14:26:08.542350 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:08 crc kubenswrapper[4912]: I0318 14:26:08.542434 4912 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-sxhvr container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:08 crc kubenswrapper[4912]: I0318 14:26:08.543269 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" podUID="5661de32-ecd9-4450-b757-465370105082" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:08 crc kubenswrapper[4912]: I0318 14:26:08.573478 
4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:08 crc kubenswrapper[4912]: I0318 14:26:08.737856 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:08 crc kubenswrapper[4912]: I0318 14:26:08.738631 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.030261 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" podUID="ef041eab-e584-4a2a-8008-9a7f07f75f70" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.030345 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.030401 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.116251 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.209305 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.209341 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.473290 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.514410 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.596366 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podUID="9ead324e-7891-4059-9d70-90462b2cc852" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.90:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.679309 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" podUID="2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.679570 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podUID="9ead324e-7891-4059-9d70-90462b2cc852" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.90:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.680207 4912 patch_prober.go:28] interesting pod/metrics-server-cdbfdff57-fmktk container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.84:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.680241 4912 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" podUID="fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.84:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.680324 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" podUID="2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.680543 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.680549 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.681531 4912 patch_prober.go:28] interesting pod/metrics-server-cdbfdff57-fmktk container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 
14:26:09.681581 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" podUID="fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.818447 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" podUID="692fb335-57d8-465c-b7ef-d94c53f84523" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.819406 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" podUID="692fb335-57d8-465c-b7ef-d94c53f84523" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.902655 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.903310 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.995751 4912 patch_prober.go:28] interesting pod/monitoring-plugin-7bb9f46ccc-95zvh container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:09 crc kubenswrapper[4912]: I0318 14:26:09.995827 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" podUID="cdcc5deb-7e0f-47e2-be3c-ccf9657de44e" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.390312 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.390984 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.391302 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.559769 4912 patch_prober.go:28] interesting pod/oauth-openshift-5f78599457-wr7bj container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.559847 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.559973 4912 patch_prober.go:28] interesting pod/oauth-openshift-5f78599457-wr7bj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.560101 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.932391 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:10 crc kubenswrapper[4912]: I0318 14:26:10.932658 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.063514 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:26:11 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:26:11 crc kubenswrapper[4912]: >
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.063543 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:26:11 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:26:11 crc kubenswrapper[4912]: >
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.114291 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.114395 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155328 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155428 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155454 4912 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-jkd5w container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155524 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155529 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" podUID="3e51cc8b-d69c-4be9-8b12-c1a10c653621" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155555 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155574 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.155618 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286227 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286261 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286295 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286333 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286337 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286372 4912 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286390 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286359 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286426 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286443 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286695 4912 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.286711 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.302789 4912 patch_prober.go:28] interesting pod/thanos-querier-656c486c6f-q6nhd container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.302908 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" podUID="f07400ad-8e47-4209-91f0-dcbdbca254b6" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.343107 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Liveness probe status=failure output="" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.362957 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Readiness probe status=failure output="" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.480366 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.481000 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.521279 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.521383 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.938290 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:11 crc kubenswrapper[4912]: I0318 14:26:11.938290 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.188799 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:26:12 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:26:12 crc kubenswrapper[4912]: >
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.203111 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:26:12 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:26:12 crc kubenswrapper[4912]: >
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.597563 4912 patch_prober.go:28] interesting pod/console-c665c8f96-4wcs8 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.598178 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.734497 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.741583 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.946237 4912 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-s2ztv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:12 crc kubenswrapper[4912]: I0318 14:26:12.946322 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" podUID="cd78a5ca-41b5-48af-a603-0ac01cbde069" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.187682 4912 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-jgbgh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.187766 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" podUID="d49e1c94-aaf7-4502-ad56-46296a08cf03" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.266929 4912 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-tcbf4 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.267065 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" podUID="95f374dc-f34c-48df-a280-3434f082b6d0" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.432056 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.432118 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.432173 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.432191 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.454117 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.454186 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.454260 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.454281 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.737674 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Mar 18 14:26:13 crc kubenswrapper[4912]: I0318 14:26:13.737737 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.139209 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.139254 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.139304 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.139348 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.196959 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.197062 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.332058 4912 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.332138 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="78d0ba71-aecb-4e22-a459-c5f690268e0e" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.348423 4912 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.348508 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.397392 4912 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.397515 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="5c4fd206-4176-47ef-9cee-8be6e9ed396f" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:14 crc kubenswrapper[4912]: I0318 14:26:14.974737 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" podUID="7ffd183f-20a4-4586-ac75-597797ada23c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:15 crc kubenswrapper[4912]: I0318 14:26:15.264795 4912 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-84z2w container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:15 crc kubenswrapper[4912]: I0318 14:26:15.264868 4912 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-84z2w container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:15 crc kubenswrapper[4912]: I0318 14:26:15.264900 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podUID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:15 crc kubenswrapper[4912]: I0318 14:26:15.264958 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podUID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:15 crc kubenswrapper[4912]: I0318 14:26:15.288338 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-g6mtn" podUID="be01ffc1-29df-445f-b0e7-6dd0e80c6297" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:26:15 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:26:15 crc kubenswrapper[4912]: >
Mar 18 14:26:15 crc kubenswrapper[4912]: I0318 14:26:15.482500 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" podUID="d96a656e-5436-4af3-b4cd-98c485c402a1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:15 crc kubenswrapper[4912]: I0318 14:26:15.680850 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-g6mtn" podUID="be01ffc1-29df-445f-b0e7-6dd0e80c6297" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:26:15 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:26:15 crc kubenswrapper[4912]: >
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.180313 4912 patch_prober.go:28] interesting pod/loki-operator-controller-manager-867987c6b7-jg2ct container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.45:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.180866 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" podUID="8efdcb68-92df-434c-8446-5be1ef0a94ba" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.45:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.299823 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.299948 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.304271 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.304371 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.306307 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.306349 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.306553 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:16 crc kubenswrapper[4912]: I0318 14:26:16.306631 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.113835 4912 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nw2vt container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.113958 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" podUID="70dff85c-f45b-431d-83ad-3b7802b15cd3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.114407 4912 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nw2vt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.114439 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" podUID="70dff85c-f45b-431d-83ad-3b7802b15cd3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.133379 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.133492 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.134143 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.134252 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.149647 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.153108 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.186849 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"70ce4157388f50d78ac10e04e377efb6d37df965da999fee14c6b0b9bc0f0098"} pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.188158 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" containerID="cri-o://70ce4157388f50d78ac10e04e377efb6d37df965da999fee14c6b0b9bc0f0098" gracePeriod=30
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.308745 4912 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
start-of-body= Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.308852 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.464381 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" podUID="c672e269-a0f9-42e0-964c-ea26f3d86a58" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.10:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.464954 4912 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-lcgrk container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.464976 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" podUID="ffcc0a7f-efff-4a18-8002-7b33a557293c" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.465798 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" podUID="c672e269-a0f9-42e0-964c-ea26f3d86a58" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.10:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.466010 4912 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-lcgrk container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.466051 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" podUID="ffcc0a7f-efff-4a18-8002-7b33a557293c" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.550228 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="a406878a-6e90-4c47-8e23-875349b55b1d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.550293 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="a406878a-6e90-4c47-8e23-875349b55b1d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.741201 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.741211 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.815323 4912 patch_prober.go:28] interesting pod/perses-operator-7bb4554dcb-4hc2x container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.97:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.815323 4912 patch_prober.go:28] interesting pod/perses-operator-7bb4554dcb-4hc2x container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.97:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.815434 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" podUID="b1062176-da75-4c7d-a3fc-b5ecee790973" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.97:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:17 crc kubenswrapper[4912]: I0318 14:26:17.815487 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" podUID="b1062176-da75-4c7d-a3fc-b5ecee790973" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.97:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.154767 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.154872 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.396445 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podUID="13092522-58a7-4c49-9164-41523060735e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.433189 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.433285 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.433189 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.433352 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.453596 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.453637 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.453696 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.453738 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.537286 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podUID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.537286 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.537341 4912 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-sxhvr container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.537592 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" podUID="5661de32-ecd9-4450-b757-465370105082" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.654346 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" podUID="f695b268-a8b7-4b72-a37b-dd342d7d369a" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.734308 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.734308 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.734834 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.735058 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.799398 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" podUID="98fed63c-9006-4589-a119-1e25fb115041" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.863649 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="4afb2214-0d5c-469e-8763-580ea6d84b7d" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.18:8080/livez\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:18 crc kubenswrapper[4912]: I0318 14:26:18.896373 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" podUID="334b170e-0f84-42b2-81a6-8c469d187fa3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.030485 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" podUID="ef041eab-e584-4a2a-8008-9a7f07f75f70" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.030657 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.031082 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" podUID="ef041eab-e584-4a2a-8008-9a7f07f75f70" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.102:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.073454 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" podUID="f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.196466 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.196476 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" podUID="6ff20347-b4ef-4d01-966c-5ba69dcf546c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.238305 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" podUID="9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.238581 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.238694 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.238772 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.471303 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.627265 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-cllp9" podUID="2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.627454 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podUID="9ead324e-7891-4059-9d70-90462b2cc852" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.90:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.628259 4912 prober.go:107] "Probe failed" 
probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podUID="9ead324e-7891-4059-9d70-90462b2cc852" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.90:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.628371 4912 patch_prober.go:28] interesting pod/metrics-server-cdbfdff57-fmktk container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.628417 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" podUID="fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.669358 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" podUID="e5f93e56-4ca9-413c-9954-f94f182b6606" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.711220 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc 
kubenswrapper[4912]: I0318 14:26:19.741389 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.777280 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" podUID="692fb335-57d8-465c-b7ef-d94c53f84523" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.828285 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.994896 4912 patch_prober.go:28] interesting pod/monitoring-plugin-7bb9f46ccc-95zvh container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:19 crc kubenswrapper[4912]: I0318 14:26:19.994968 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" podUID="cdcc5deb-7e0f-47e2-be3c-ccf9657de44e" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.319638 4912 
patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-qnf86 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.319747 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" podUID="39a2121c-5ff0-4ff6-84de-f1354552a568" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.39:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.394264 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.394392 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.394672 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.477586 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-tpm8v" 
podUID="1c9a2194-27ba-4a86-b5c1-e8356c71227f" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.478382 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-tpm8v" podUID="1c9a2194-27ba-4a86-b5c1-e8356c71227f" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.559881 4912 patch_prober.go:28] interesting pod/oauth-openshift-5f78599457-wr7bj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.560453 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.560226 4912 patch_prober.go:28] interesting pod/oauth-openshift-5f78599457-wr7bj container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.560832 4912 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.929306 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:20 crc kubenswrapper[4912]: I0318 14:26:20.929455 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.071104 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.071229 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc 
kubenswrapper[4912]: I0318 14:26:21.081811 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.082052 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.084144 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.084269 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.092689 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"71691bab47dd1299c204f36593b2e87a78feafb957d24ebcddb7db7fb06086b0"} pod="openshift-console-operator/console-operator-58897d9998-tjwtn" containerMessage="Container console-operator failed liveness probe, will be restarted" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.094794 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" containerID="cri-o://71691bab47dd1299c204f36593b2e87a78feafb957d24ebcddb7db7fb06086b0" gracePeriod=30 Mar 18 14:26:21 
crc kubenswrapper[4912]: I0318 14:26:21.104818 4912 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-jkd5w container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.105012 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" podUID="3e51cc8b-d69c-4be9-8b12-c1a10c653621" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.105186 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287182 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287245 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287290 4912 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zzbbb container/olm-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287483 4912 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287564 4912 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zzbbb container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287624 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287270 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287656 4912 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287664 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287690 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287685 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287624 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" podUID="5037377a-5754-40b3-8ffc-ef8776d54442" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287782 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287697 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287839 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287665 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.287943 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.289563 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" 
containerStatusID={"Type":"cri-o","ID":"9ca52ef970a8b89cd53174e3362454201e044f89267e4da1ee360b295cf4b197"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.289608 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" containerID="cri-o://9ca52ef970a8b89cd53174e3362454201e044f89267e4da1ee360b295cf4b197" gracePeriod=30 Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.289936 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" podUID="5037377a-5754-40b3-8ffc-ef8776d54442" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.342305 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.342388 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.342321 4912 patch_prober.go:28] interesting pod/thanos-querier-656c486c6f-q6nhd 
container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.342539 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" podUID="f07400ad-8e47-4209-91f0-dcbdbca254b6" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.385224 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.385294 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.521302 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.521361 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Liveness probe 
status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.521376 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.521387 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.521424 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.521480 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.522944 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34"} pod="openshift-ingress/router-default-5444994796-bbvtw" containerMessage="Container router failed liveness probe, will be restarted" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.522986 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" 
containerID="cri-o://b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34" gracePeriod=10 Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.795690 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zszmc" podUID="23eaceef-5a11-4610-91b0-6ca3c42c167f" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.938375 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.938338 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.939178 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/speaker-zw7wz" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.943091 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zw7wz" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.947439 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="speaker" containerStatusID={"Type":"cri-o","ID":"1865d794c53fc9c0b08efce4bbd193e08db3f1df48dc3e4396739edf9e48cbf4"} pod="metallb-system/speaker-zw7wz" containerMessage="Container speaker failed liveness probe, will be restarted" Mar 18 14:26:21 crc kubenswrapper[4912]: I0318 14:26:21.947535 4912 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" containerID="cri-o://1865d794c53fc9c0b08efce4bbd193e08db3f1df48dc3e4396739edf9e48cbf4" gracePeriod=2 Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.085321 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.085393 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.106632 4912 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-jkd5w container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.106723 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" podUID="3e51cc8b-d69c-4be9-8b12-c1a10c653621" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.130546 4912 patch_prober.go:28] interesting 
pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.130621 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.297383 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.298119 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.582579 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.582579 4912 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="a406878a-6e90-4c47-8e23-875349b55b1d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.582580 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="a406878a-6e90-4c47-8e23-875349b55b1d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.585193 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.597474 4912 patch_prober.go:28] interesting pod/console-c665c8f96-4wcs8 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.597627 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.598796 4912 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.736813 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.737606 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.946449 4912 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-s2ztv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.946533 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" podUID="cd78a5ca-41b5-48af-a603-0ac01cbde069" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.946645 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 14:26:22 crc kubenswrapper[4912]: I0318 14:26:22.987411 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zw7wz" podUID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerName="speaker" probeResult="failure" 
output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.181146 4912 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-jgbgh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.181252 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" podUID="d49e1c94-aaf7-4502-ad56-46296a08cf03" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.181394 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.267492 4912 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-tcbf4 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.267606 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" podUID="95f374dc-f34c-48df-a280-3434f082b6d0" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.267751 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.418565 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.418687 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.433457 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.433601 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.433625 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" 
podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.433708 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.453888 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.454114 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.454605 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.454764 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.601428 4912 patch_prober.go:28] interesting pod/console-c665c8f96-4wcs8 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.602182 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.736078 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.736442 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.739015 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.948336 4912 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-s2ztv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:23 crc kubenswrapper[4912]: I0318 14:26:23.948448 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" podUID="cd78a5ca-41b5-48af-a603-0ac01cbde069" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.182581 4912 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-jgbgh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.182701 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" podUID="d49e1c94-aaf7-4502-ad56-46296a08cf03" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.196207 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.196327 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.278524 4912 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-tcbf4 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.278611 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" podUID="95f374dc-f34c-48df-a280-3434f082b6d0" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.333003 4912 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.333450 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="78d0ba71-aecb-4e22-a459-c5f690268e0e" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.347324 4912 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.347426 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.398022 4912 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.398125 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="5c4fd206-4176-47ef-9cee-8be6e9ed396f" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.503342 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" podUID="b7ec4270-842e-49cb-8d22-16df7b212443" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.503441 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" podUID="b7ec4270-842e-49cb-8d22-16df7b212443" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.741069 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-sfv6d" podUID="4375d78c-761e-4691-9da9-89f56373ea76" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.741180 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-sfv6d" podUID="4375d78c-761e-4691-9da9-89f56373ea76" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.741394 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:24 crc kubenswrapper[4912]: I0318 14:26:24.741528 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.014216 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" podUID="7ffd183f-20a4-4586-ac75-597797ada23c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.014386 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" podUID="7ffd183f-20a4-4586-ac75-597797ada23c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.131346 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.131467 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.266007 4912 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-84z2w container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.266091 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podUID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.267290 4912 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-84z2w container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.267366 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podUID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.424433 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" podUID="d96a656e-5436-4af3-b4cd-98c485c402a1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.480725 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" podUID="d96a656e-5436-4af3-b4cd-98c485c402a1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.485444 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-tjwtn_55da9bcd-23b6-4ea7-8f43-26c43d05a9e3/console-operator/0.log"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.485708 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" event={"ID":"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3","Type":"ContainerDied","Data":"71691bab47dd1299c204f36593b2e87a78feafb957d24ebcddb7db7fb06086b0"}
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.487938 4912 generic.go:334] "Generic (PLEG): container finished" podID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerID="71691bab47dd1299c204f36593b2e87a78feafb957d24ebcddb7db7fb06086b0" exitCode=1
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.492750 4912 generic.go:334] "Generic (PLEG): container finished" podID="c6af6424-58bd-4c40-a86c-15627b762a9a" containerID="1865d794c53fc9c0b08efce4bbd193e08db3f1df48dc3e4396739edf9e48cbf4" exitCode=137
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.492799 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zw7wz" event={"ID":"c6af6424-58bd-4c40-a86c-15627b762a9a","Type":"ContainerDied","Data":"1865d794c53fc9c0b08efce4bbd193e08db3f1df48dc3e4396739edf9e48cbf4"}
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.515880 4912 patch_prober.go:28] interesting pod/apiserver-76f77b778f-t75hw container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/readyz?exclude=etcd&exclude=etcd-readiness\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.515956 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-t75hw" podUID="ff22e507-73a7-44b1-9eab-c704fb998092" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.38:8443/readyz?exclude=etcd&exclude=etcd-readiness\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.738050 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-pr4zx" podUID="b5944127-745d-42f9-83c2-d448435da4c9" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.738083 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.738047 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-pr4zx" podUID="b5944127-745d-42f9-83c2-d448435da4c9" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.738672 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.741851 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"1e366ffa0c569c413306141c62fa70428165be06b91a7f09bb4263f76a862376"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted"
Mar 18 14:26:25 crc kubenswrapper[4912]: I0318 14:26:25.741944 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b" containerName="ceilometer-central-agent" containerID="cri-o://1e366ffa0c569c413306141c62fa70428165be06b91a7f09bb4263f76a862376" gracePeriod=30
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.205538 4912 trace.go:236] Trace[1926210929]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (18-Mar-2026 14:26:18.292) (total time: 7908ms):
Mar 18 14:26:26 crc kubenswrapper[4912]: Trace[1926210929]: [7.908747203s] [7.908747203s] END
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.205569 4912 trace.go:236] Trace[339060713]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-compactor-0" (18-Mar-2026 14:26:19.524) (total time: 6676ms):
Mar 18 14:26:26 crc kubenswrapper[4912]: Trace[339060713]: [6.676530929s] [6.676530929s] END
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.206080 4912 trace.go:236] Trace[163141021]: "Calculate volume metrics of storage for pod minio-dev/minio" (18-Mar-2026 14:26:19.414) (total time: 6786ms):
Mar 18 14:26:26 crc kubenswrapper[4912]: Trace[163141021]: [6.786781163s] [6.786781163s] END
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.221269 4912 patch_prober.go:28] interesting pod/loki-operator-controller-manager-867987c6b7-jg2ct container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.45:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.221299 4912 patch_prober.go:28] interesting pod/loki-operator-controller-manager-867987c6b7-jg2ct container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.45:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.221352 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" podUID="8efdcb68-92df-434c-8446-5be1ef0a94ba" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.45:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.221382 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" podUID="8efdcb68-92df-434c-8446-5be1ef0a94ba" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.45:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.251976 4912 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.252065 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.298496 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.298574 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.298600 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.298684 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.298667 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.301458 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"4317cf97fdfbf35b04a793d8a339ef489d493e75a32877049563389b71ba16e5"} pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" containerMessage="Container route-controller-manager failed liveness probe, will be restarted"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.301530 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" containerID="cri-o://4317cf97fdfbf35b04a793d8a339ef489d493e75a32877049563389b71ba16e5" gracePeriod=30
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.302951 4912 patch_prober.go:28] interesting pod/thanos-querier-656c486c6f-q6nhd container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.303074 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" podUID="f07400ad-8e47-4209-91f0-dcbdbca254b6" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.305104 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.305169 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.305233 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.307049 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller-manager" containerStatusID={"Type":"cri-o","ID":"7dae4616235a902ccd3a026d9bf2db75cb2b0d8e0a9ceba0743173e72fddbb0e"} pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" containerMessage="Container controller-manager failed liveness probe, will be restarted"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.307105 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" containerID="cri-o://7dae4616235a902ccd3a026d9bf2db75cb2b0d8e0a9ceba0743173e72fddbb0e" gracePeriod=30
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.308862 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.308934 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.732499 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.735178 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zszmc" podUID="23eaceef-5a11-4610-91b0-6ca3c42c167f" containerName="nmstate-handler" probeResult="failure" output="command timed out"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.738424 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-g6mtn" podUID="be01ffc1-29df-445f-b0e7-6dd0e80c6297" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.738547 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-g6mtn" podUID="be01ffc1-29df-445f-b0e7-6dd0e80c6297" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.745315 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:26 crc kubenswrapper[4912]: I0318 14:26:26.745405 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:26 crc kubenswrapper[4912]: E0318 14:26:26.924225 4912 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.111409 4912 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nw2vt container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.111511 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" podUID="70dff85c-f45b-431d-83ad-3b7802b15cd3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.111369 4912 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nw2vt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.111600 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nw2vt" podUID="70dff85c-f45b-431d-83ad-3b7802b15cd3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.309214 4912 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.309287 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.391286 4912 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-lcgrk container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.391350 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" podUID="ffcc0a7f-efff-4a18-8002-7b33a557293c" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.732196 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="a406878a-6e90-4c47-8e23-875349b55b1d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/healthy\": context deadline exceeded"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.734618 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" probeResult="failure" output="command timed out"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.743251 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" probeResult="failure" output="command timed out"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.747209 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="a406878a-6e90-4c47-8e23-875349b55b1d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.759369 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.761145 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.761208 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.767248 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.769758 4912 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-lcgrk container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.769863 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-lcgrk" podUID="ffcc0a7f-efff-4a18-8002-7b33a557293c" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.96:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.812610 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" podUID="c672e269-a0f9-42e0-964c-ea26f3d86a58" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.10:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.854760 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-cxrkp" podUID="c672e269-a0f9-42e0-964c-ea26f3d86a58" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.10:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.897529 4912 patch_prober.go:28] interesting pod/perses-operator-7bb4554dcb-4hc2x container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.97:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 14:26:27 crc kubenswrapper[4912]: I0318 14:26:27.897617 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-7bb4554dcb-4hc2x" podUID="b1062176-da75-4c7d-a3fc-b5ecee790973" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.97:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.078185 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-4r8tg" podUID="3abcfc85-e792-4ba8-a6c2-db7130b1f423" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.130580 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.131063 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.734753 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.735361 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.734843 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.735843 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.734747 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" probeResult="failure" output="command timed out"
Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.735975 4912 prober.go:107] "Probe failed"
probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.737544 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"2c12695758a8e030ab1bbae40231e9e1240ab28ee6b2233948263853940f7e9f"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.898519 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" podUID="2faefcc2-b6a3-4dee-a077-af88038f3565" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.939655 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" podUID="2faefcc2-b6a3-4dee-a077-af88038f3565" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.980220 4912 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-sxhvr container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.980281 4912 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.980300 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" podUID="5661de32-ecd9-4450-b757-465370105082" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.980447 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.990904 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"1a2abf407dbb7a4d45cd9bea2e08dcd690fb6c3f51bcc6ef05aeb8e10c01d4c3"} pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 18 14:26:28 crc kubenswrapper[4912]: I0318 14:26:28.991005 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" podUID="5661de32-ecd9-4450-b757-465370105082" containerName="authentication-operator" containerID="cri-o://1a2abf407dbb7a4d45cd9bea2e08dcd690fb6c3f51bcc6ef05aeb8e10c01d4c3" gracePeriod=30 Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.062323 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" podUID="98fed63c-9006-4589-a119-1e25fb115041" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.235203 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podUID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.235252 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podUID="13092522-58a7-4c49-9164-41523060735e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.235449 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podUID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.317322 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="4afb2214-0d5c-469e-8763-580ea6d84b7d" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.18:8080/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc 
kubenswrapper[4912]: I0318 14:26:29.317371 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="4afb2214-0d5c-469e-8763-580ea6d84b7d" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.18:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.317324 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" podUID="334b170e-0f84-42b2-81a6-8c469d187fa3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.317377 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.317600 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.440251 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.440778 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.522391 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" podUID="f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.522536 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" podUID="f695b268-a8b7-4b72-a37b-dd342d7d369a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.605342 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" podUID="9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.647661 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podUID="13092522-58a7-4c49-9164-41523060735e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.647662 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" podUID="6ff20347-b4ef-4d01-966c-5ba69dcf546c" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.730285 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.730285 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" podUID="98fed63c-9006-4589-a119-1e25fb115041" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.730433 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" podUID="f695b268-a8b7-4b72-a37b-dd342d7d369a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.731365 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" podUID="334b170e-0f84-42b2-81a6-8c469d187fa3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.741625 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.815345 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" podUID="ef041eab-e584-4a2a-8008-9a7f07f75f70" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.815529 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.815694 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.815370 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" podUID="45ef8022-adf2-46bc-a112-a5532880c080" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.853729 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zw7wz" event={"ID":"c6af6424-58bd-4c40-a86c-15627b762a9a","Type":"ContainerStarted","Data":"a7301e871c43964f7dec1808d5bc3222e8ec9c2fe0d939fde8384d7185a442ac"} Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.854650 4912 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/speaker-zw7wz" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.863799 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-tjwtn_55da9bcd-23b6-4ea7-8f43-26c43d05a9e3/console-operator/0.log" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.864306 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" event={"ID":"55da9bcd-23b6-4ea7-8f43-26c43d05a9e3","Type":"ContainerStarted","Data":"c3cf047a1184b22349b84ba22b4db16bdd554af04b82e3f909dcca8256979c00"} Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.864856 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.899371 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.899584 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.899585 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" podUID="f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.981330 4912 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.981398 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podUID="9ead324e-7891-4059-9d70-90462b2cc852" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.90:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.981529 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.981903 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.982192 4912 patch_prober.go:28] interesting pod/metrics-server-cdbfdff57-fmktk container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.982288 4912 patch_prober.go:28] interesting pod/metrics-server-cdbfdff57-fmktk container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get 
\"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.982277 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" podUID="fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.982330 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" podUID="fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.84:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.982399 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.990485 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"1e7e91974f3cc6458f781703e9c5006f59a4acd8cce645e7358e0c06d4c2897d"} pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" containerMessage="Container webhook-server failed liveness probe, will be restarted" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.990591 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podUID="9ead324e-7891-4059-9d70-90462b2cc852" containerName="webhook-server" 
containerID="cri-o://1e7e91974f3cc6458f781703e9c5006f59a4acd8cce645e7358e0c06d4c2897d" gracePeriod=2 Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.990939 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.991598 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"a8371e44cd45687f253f6ae01e1fd665429fe9f448934a897b3b4aed3a2a1268"} pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 18 14:26:29 crc kubenswrapper[4912]: I0318 14:26:29.991682 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" podUID="fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd" containerName="metrics-server" containerID="cri-o://a8371e44cd45687f253f6ae01e1fd665429fe9f448934a897b3b4aed3a2a1268" gracePeriod=170 Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.064369 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" podUID="e5f93e56-4ca9-413c-9954-f94f182b6606" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.064461 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" podUID="6ff20347-b4ef-4d01-966c-5ba69dcf546c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.146284 4912 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.146310 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.146482 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.146545 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.238560 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" podUID="45ef8022-adf2-46bc-a112-a5532880c080" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.320256 4912 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-qnf86 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.320364 
4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" podUID="39a2121c-5ff0-4ff6-84de-f1354552a568" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.39:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.322508 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.322461 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.322786 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" podUID="9ead324e-7891-4059-9d70-90462b2cc852" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.90:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.322893 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.322948 4912 patch_prober.go:28] interesting pod/monitoring-plugin-7bb9f46ccc-95zvh container/monitoring-plugin namespace/openshift-monitoring: Readiness 
probe status=failure output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.322972 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" podUID="cdcc5deb-7e0f-47e2-be3c-ccf9657de44e" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323018 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323026 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" podUID="e5f93e56-4ca9-413c-9954-f94f182b6606" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323261 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323420 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" podUID="692fb335-57d8-465c-b7ef-d94c53f84523" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323604 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323638 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323746 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323767 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.323824 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" podUID="692fb335-57d8-465c-b7ef-d94c53f84523" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc 
kubenswrapper[4912]: I0318 14:26:30.324032 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.324057 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.324130 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.324371 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.324396 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.447241 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.447259 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.447354 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-ngzqk" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.447292 4912 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-qnf86 container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.39:8443/livez?exclude=etcd\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.447452 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnf86" podUID="39a2121c-5ff0-4ff6-84de-f1354552a568" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.39:8443/livez?exclude=etcd\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.451545 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.453773 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller" containerStatusID={"Type":"cri-o","ID":"6ddb7fb8b94f5302e47ce0b36cc2f0df9ccb93c03581f10ead06f49e4d8a813b"} pod="metallb-system/frr-k8s-ngzqk" containerMessage="Container controller 
failed liveness probe, will be restarted" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.453936 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="controller" containerID="cri-o://6ddb7fb8b94f5302e47ce0b36cc2f0df9ccb93c03581f10ead06f49e4d8a813b" gracePeriod=2 Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.488461 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.488926 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.488979 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-ngzqk" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.572326 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-tpm8v" podUID="1c9a2194-27ba-4a86-b5c1-e8356c71227f" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.572492 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="controller" probeResult="failure" output="Get 
\"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.572669 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ngzqk" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.577606 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.577679 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.619978 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-tpm8v" podUID="1c9a2194-27ba-4a86-b5c1-e8356c71227f" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.620147 4912 patch_prober.go:28] interesting pod/oauth-openshift-5f78599457-wr7bj container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.620246 4912 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.620321 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.620678 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" podUID="8b12ea77-cfde-4e3d-bdc7-04c350f17c09" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.620007 4912 patch_prober.go:28] interesting pod/oauth-openshift-5f78599457-wr7bj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.620826 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.621496 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" 
Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.622591 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-openshift" containerStatusID={"Type":"cri-o","ID":"2d3916fa1a525ce458866b8ee169fe7859782f6da0b61c3d51454e00f476b357"} pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" containerMessage="Container oauth-openshift failed liveness probe, will be restarted" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.879108 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" event={"ID":"389bca57-3d65-4ed4-8b0d-9c09c58ecf99","Type":"ContainerDied","Data":"9ca52ef970a8b89cd53174e3362454201e044f89267e4da1ee360b295cf4b197"} Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.897378 4912 generic.go:334] "Generic (PLEG): container finished" podID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerID="9ca52ef970a8b89cd53174e3362454201e044f89267e4da1ee360b295cf4b197" exitCode=0 Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.957348 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" podUID="ef041eab-e584-4a2a-8008-9a7f07f75f70" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.957392 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.957598 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.999271 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.999305 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" podUID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:30 crc kubenswrapper[4912]: I0318 14:26:30.999405 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:30.999454 4912 patch_prober.go:28] interesting pod/console-operator-58897d9998-tjwtn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:30.999518 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" podUID="55da9bcd-23b6-4ea7-8f43-26c43d05a9e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.105306 4912 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-jkd5w container/nmstate-webhook 
namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.105466 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" podUID="3e51cc8b-d69c-4be9-8b12-c1a10c653621" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.66:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.130302 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.130942 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.231160 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" podUID="3821e364-991e-4a58-88e6-cf499d12aa70" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.231193 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" podUID="35ae7eba-4b8f-43ac-b828-5cbc84fed044" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.302949 4912 patch_prober.go:28] interesting pod/thanos-querier-656c486c6f-q6nhd container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.303108 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-656c486c6f-q6nhd" podUID="f07400ad-8e47-4209-91f0-dcbdbca254b6" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.82:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313368 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313436 4912 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zzbbb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313377 4912 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313541 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313545 4912 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zzbbb container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313604 4912 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313469 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc 
kubenswrapper[4912]: I0318 14:26:31.313646 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313665 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313635 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" podUID="5037377a-5754-40b3-8ffc-ef8776d54442" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313529 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zzbbb" podUID="5037377a-5754-40b3-8ffc-ef8776d54442" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313705 4912 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313738 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313742 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.313755 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.315438 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="catalog-operator" containerStatusID={"Type":"cri-o","ID":"b9652941f65b158fd0ba0242c4dc87bb28a99ed545fbc984d01319fb5e050100"} pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" containerMessage="Container catalog-operator failed liveness probe, will be restarted" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.315503 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" containerID="cri-o://b9652941f65b158fd0ba0242c4dc87bb28a99ed545fbc984d01319fb5e050100" gracePeriod=30 Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.324394 4912 patch_prober.go:28] interesting pod/monitoring-plugin-7bb9f46ccc-95zvh container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 
14:26:31.324458 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" podUID="cdcc5deb-7e0f-47e2-be3c-ccf9657de44e" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.85:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.396265 4912 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.396378 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.396489 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.396628 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc 
kubenswrapper[4912]: I0318 14:26:31.396748 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.478454 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.478559 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.519445 4912 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.519466 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" podUID="692fb335-57d8-465c-b7ef-d94c53f84523" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.519548 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" 
probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.519706 4912 patch_prober.go:28] interesting pod/downloads-7954f5f757-2ghlz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.519855 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2ghlz" podUID="1b34ff88-74eb-45ce-acd4-3b7b272e1747" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.578343 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" podUID="19eca4e0-1677-4af5-993a-4cd45173287e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.578478 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.578641 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.637378 4912 patch_prober.go:28] interesting pod/console-c665c8f96-4wcs8 container/console namespace/openshift-console: Liveness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.637505 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.637609 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.640379 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"902fe890ebba0eba046de3dd16b6f2a5f9b183abc29577c612d48926a743e605"} pod="openshift-console/console-c665c8f96-4wcs8" containerMessage="Container console failed liveness probe, will be restarted" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.735174 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zszmc" podUID="23eaceef-5a11-4610-91b0-6ca3c42c167f" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.735729 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.912733 4912 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="package-server-manager" containerStatusID={"Type":"cri-o","ID":"192e1c96eff03e0f62710194722b73674fcb0688acd78aa035037f1792a605b1"} pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" containerMessage="Container package-server-manager failed liveness probe, will be restarted" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.912841 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" containerID="cri-o://192e1c96eff03e0f62710194722b73674fcb0688acd78aa035037f1792a605b1" gracePeriod=30 Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.912830 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr-k8s-webhook-server" containerStatusID={"Type":"cri-o","ID":"a3882757bbf19032efcdfc98738b4713ab8eb41ae9cfa9af6fc0a0163e543a3b"} pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" containerMessage="Container frr-k8s-webhook-server failed liveness probe, will be restarted" Mar 18 14:26:31 crc kubenswrapper[4912]: I0318 14:26:31.912944 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" containerID="cri-o://a3882757bbf19032efcdfc98738b4713ab8eb41ae9cfa9af6fc0a0163e543a3b" gracePeriod=10 Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.001448 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:32 crc kubenswrapper[4912]: 
I0318 14:26:32.108191 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zszmc" Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.356469 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.356626 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.356443 4912 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.356742 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.598081 4912 patch_prober.go:28] interesting pod/console-c665c8f96-4wcs8 container/console 
namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.598168 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.733122 4912 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-d8c92 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.69:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.733567 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-d8c92" podUID="065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.69:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.735358 4912 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-d8c92 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.69:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.735452 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-image-registry/image-registry-66df7c8f76-d8c92" podUID="065eeb41-b1d9-4ea8-9f4f-675a5bce6c3b" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.69:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.945992 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-bbvtw_7feb8268-723e-408b-b800-744481779d38/router/0.log" Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.946083 4912 generic.go:334] "Generic (PLEG): container finished" podID="7feb8268-723e-408b-b800-744481779d38" containerID="b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34" exitCode=137 Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.946172 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bbvtw" event={"ID":"7feb8268-723e-408b-b800-744481779d38","Type":"ContainerDied","Data":"b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34"} Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.947416 4912 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-s2ztv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.947470 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" podUID="cd78a5ca-41b5-48af-a603-0ac01cbde069" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:32 crc 
kubenswrapper[4912]: I0318 14:26:32.951607 4912 generic.go:334] "Generic (PLEG): container finished" podID="9ead324e-7891-4059-9d70-90462b2cc852" containerID="1e7e91974f3cc6458f781703e9c5006f59a4acd8cce645e7358e0c06d4c2897d" exitCode=137 Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.951713 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" event={"ID":"9ead324e-7891-4059-9d70-90462b2cc852","Type":"ContainerDied","Data":"1e7e91974f3cc6458f781703e9c5006f59a4acd8cce645e7358e0c06d4c2897d"} Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.958515 4912 generic.go:334] "Generic (PLEG): container finished" podID="8efdcb68-92df-434c-8446-5be1ef0a94ba" containerID="530a49354dcf66a5b129a74c75a194004b2b0eb590109f20f74f88253195b02f" exitCode=1 Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.960086 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" event={"ID":"8efdcb68-92df-434c-8446-5be1ef0a94ba","Type":"ContainerDied","Data":"530a49354dcf66a5b129a74c75a194004b2b0eb590109f20f74f88253195b02f"} Mar 18 14:26:32 crc kubenswrapper[4912]: I0318 14:26:32.964464 4912 scope.go:117] "RemoveContainer" containerID="530a49354dcf66a5b129a74c75a194004b2b0eb590109f20f74f88253195b02f" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.044335 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.180910 4912 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-jgbgh container/loki-querier namespace/openshift-logging: Readiness probe status=failure 
output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.180988 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" podUID="d49e1c94-aaf7-4502-ad56-46296a08cf03" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.268215 4912 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-tcbf4 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.268657 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" podUID="95f374dc-f34c-48df-a280-3434f082b6d0" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.398283 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.398343 4912 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.432886 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.432946 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.432972 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": context deadline exceeded" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.433073 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": context deadline exceeded" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.454980 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get 
\"https://10.217.0.54:8081/ready\": context deadline exceeded" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.455061 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": context deadline exceeded" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.455184 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.455273 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.598742 4912 patch_prober.go:28] interesting pod/console-c665c8f96-4wcs8 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.598852 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.729763 4912 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:36216->192.168.126.11:10257: read: connection reset by peer" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.729851 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:36216->192.168.126.11:10257: read: connection reset by peer" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739016 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739021 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739154 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739210 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739263 
4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739243 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739256 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="544b60c6-45d2-415c-9145-0dc544d78e4a" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739473 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739501 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.739538 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.742148 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00"} pod="openstack-operators/openstack-operator-index-tkt7x" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.742207 4912 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" containerID="cri-o://b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00" gracePeriod=30 Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.742324 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b"} pod="openshift-marketplace/redhat-marketplace-98dk7" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.742376 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" containerID="cri-o://09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b" gracePeriod=30 Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.946706 4912 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-s2ztv container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.50:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.946818 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" podUID="cd78a5ca-41b5-48af-a603-0ac01cbde069" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.979551 4912 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" event={"ID":"389bca57-3d65-4ed4-8b0d-9c09c58ecf99","Type":"ContainerStarted","Data":"407da7b269f65f553fa1ccf602b8512b2279db15fe65c18842047c9dc6d64ca5"} Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.979762 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.980245 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.980308 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.998581 4912 generic.go:334] "Generic (PLEG): container finished" podID="0475f7b9-387c-422d-88c8-90416895b720" containerID="6ddb7fb8b94f5302e47ce0b36cc2f0df9ccb93c03581f10ead06f49e4d8a813b" exitCode=137 Mar 18 14:26:33 crc kubenswrapper[4912]: I0318 14:26:33.998678 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerDied","Data":"6ddb7fb8b94f5302e47ce0b36cc2f0df9ccb93c03581f10ead06f49e4d8a813b"} Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.129975 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness 
probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.130057 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.182922 4912 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-jgbgh container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.51:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.183019 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" podUID="d49e1c94-aaf7-4502-ad56-46296a08cf03" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.225312 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.225474 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" 
containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.55:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.225730 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.267166 4912 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-tcbf4 container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.267256 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" podUID="95f374dc-f34c-48df-a280-3434f082b6d0" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.332770 4912 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.333382 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="78d0ba71-aecb-4e22-a459-c5f690268e0e" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.333481 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.347554 4912 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.347642 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.347753 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.396875 4912 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.396946 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="5c4fd206-4176-47ef-9cee-8be6e9ed396f" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.397074 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.433125 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-vqfpz container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.433224 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-vqfpz" podUID="86abc7c8-2019-4f25-84a0-f764bc3f10d6" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.454286 4912 patch_prober.go:28] interesting pod/logging-loki-gateway-b5bdf65c4-ldbjt container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.454379 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-b5bdf65c4-ldbjt" podUID="22169096-dc0c-47ea-a40e-728cac38c1d4" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.460292 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-qsghf" podUID="b7ec4270-842e-49cb-8d22-16df7b212443" containerName="manager" probeResult="failure" 
output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.738227 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-sfv6d" podUID="4375d78c-761e-4691-9da9-89f56373ea76" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:34 crc kubenswrapper[4912]: I0318 14:26:34.738649 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-sfv6d" podUID="4375d78c-761e-4691-9da9-89f56373ea76" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.025431 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.033464 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.037415 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.037476 4912 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="47de1d4d2a4cc0449a49a7726dfac56128bf96a71c4b88cf7a3a311bfe14b370" exitCode=1 Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.037576 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"47de1d4d2a4cc0449a49a7726dfac56128bf96a71c4b88cf7a3a311bfe14b370"} Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.037614 4912 scope.go:117] "RemoveContainer" containerID="033c8d97bacf98c7ec2e36fad49fb41b161b92e3d0ea907012c00b4248974787" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.039516 4912 scope.go:117] "RemoveContainer" containerID="47de1d4d2a4cc0449a49a7726dfac56128bf96a71c4b88cf7a3a311bfe14b370" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.041316 4912 generic.go:334] "Generic (PLEG): container finished" podID="f695b268-a8b7-4b72-a37b-dd342d7d369a" containerID="f09b36bfaee9e8ed70baeddc827771198905bf7caeb7cc1756ffcf377854370e" exitCode=1 Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.041425 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" event={"ID":"f695b268-a8b7-4b72-a37b-dd342d7d369a","Type":"ContainerDied","Data":"f09b36bfaee9e8ed70baeddc827771198905bf7caeb7cc1756ffcf377854370e"} Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.042756 4912 scope.go:117] "RemoveContainer" containerID="f09b36bfaee9e8ed70baeddc827771198905bf7caeb7cc1756ffcf377854370e" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.059417 4912 generic.go:334] "Generic (PLEG): container finished" podID="e5f93e56-4ca9-413c-9954-f94f182b6606" containerID="ad09aabc54bd1c47bf6353a48b357e0a74f3340770f17640e3637c60b2a6f544" exitCode=1 Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.059524 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" event={"ID":"e5f93e56-4ca9-413c-9954-f94f182b6606","Type":"ContainerDied","Data":"ad09aabc54bd1c47bf6353a48b357e0a74f3340770f17640e3637c60b2a6f544"} Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.060944 4912 scope.go:117] "RemoveContainer" 
containerID="ad09aabc54bd1c47bf6353a48b357e0a74f3340770f17640e3637c60b2a6f544" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.071697 4912 generic.go:334] "Generic (PLEG): container finished" podID="718af076-f027-4594-8294-53ec36b84f3c" containerID="b9652941f65b158fd0ba0242c4dc87bb28a99ed545fbc984d01319fb5e050100" exitCode=0 Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.072476 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" event={"ID":"718af076-f027-4594-8294-53ec36b84f3c","Type":"ContainerDied","Data":"b9652941f65b158fd0ba0242c4dc87bb28a99ed545fbc984d01319fb5e050100"} Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.074324 4912 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ptbgq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.074393 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" podUID="389bca57-3d65-4ed4-8b0d-9c09c58ecf99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.138193 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.224337 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.224439 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.226500 4912 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.226670 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9951708d-b5a5-4dea-9cb5-a89c96f2a404" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.265201 4912 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-84z2w container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.265308 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podUID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.265411 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.267357 4912 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-84z2w container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.267452 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podUID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.267525 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.299055 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.299525 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.332377 4912 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.332474 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-compactor-0" podUID="78d0ba71-aecb-4e22-a459-c5f690268e0e" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.56:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.335392 4912 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.335454 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="78d0ba71-aecb-4e22-a459-c5f690268e0e" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.349090 4912 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler 
namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.349162 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.397802 4912 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.397905 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="5c4fd206-4176-47ef-9cee-8be6e9ed396f" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.400353 4912 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.400458 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-index-gateway-0" 
podUID="5c4fd206-4176-47ef-9cee-8be6e9ed396f" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.57:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.465258 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" podUID="d96a656e-5436-4af3-b4cd-98c485c402a1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.465381 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.738495 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-pr4zx" podUID="b5944127-745d-42f9-83c2-d448435da4c9" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.740692 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-pr4zx" podUID="b5944127-745d-42f9-83c2-d448435da4c9" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.806368 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:42434->10.217.0.67:8443: read: connection reset by peer" start-of-body= Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.806458 4912 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:42434->10.217.0.67:8443: read: connection reset by peer" Mar 18 14:26:35 crc kubenswrapper[4912]: I0318 14:26:35.820112 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.113664 4912 generic.go:334] "Generic (PLEG): container finished" podID="334b170e-0f84-42b2-81a6-8c469d187fa3" containerID="a4b88a2e6f74a393beebd9f0978cd7ab5f4796b4198f0930f82123a5c688f5ce" exitCode=1 Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.113841 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" event={"ID":"334b170e-0f84-42b2-81a6-8c469d187fa3","Type":"ContainerDied","Data":"a4b88a2e6f74a393beebd9f0978cd7ab5f4796b4198f0930f82123a5c688f5ce"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.116953 4912 scope.go:117] "RemoveContainer" containerID="a4b88a2e6f74a393beebd9f0978cd7ab5f4796b4198f0930f82123a5c688f5ce" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.128330 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" event={"ID":"9ead324e-7891-4059-9d70-90462b2cc852","Type":"ContainerStarted","Data":"b0b14aa852947eab800af84fcc8d29bb17aca63bc67b6cd7e189e62eb6f7ce98"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.129486 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.137647 4912 generic.go:334] "Generic (PLEG): container finished" podID="45ef8022-adf2-46bc-a112-a5532880c080" 
containerID="a08dafbd9cbe4398bee406eeb92482a1177905b4efa74b2db5352cd6b973731c" exitCode=1 Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.137710 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" event={"ID":"45ef8022-adf2-46bc-a112-a5532880c080","Type":"ContainerDied","Data":"a08dafbd9cbe4398bee406eeb92482a1177905b4efa74b2db5352cd6b973731c"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.138719 4912 scope.go:117] "RemoveContainer" containerID="a08dafbd9cbe4398bee406eeb92482a1177905b4efa74b2db5352cd6b973731c" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.146919 4912 generic.go:334] "Generic (PLEG): container finished" podID="9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4" containerID="563b7fc7c9639fb508ff660ec01f7ba421e215a122cde8fd4816b239b7c06b34" exitCode=1 Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.147000 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" event={"ID":"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4","Type":"ContainerDied","Data":"563b7fc7c9639fb508ff660ec01f7ba421e215a122cde8fd4816b239b7c06b34"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.148286 4912 scope.go:117] "RemoveContainer" containerID="563b7fc7c9639fb508ff660ec01f7ba421e215a122cde8fd4816b239b7c06b34" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.150826 4912 generic.go:334] "Generic (PLEG): container finished" podID="7ffd183f-20a4-4586-ac75-597797ada23c" containerID="f2010f0f0b777b55c4bd3f94257ee1b2cb3f040b26a6fc492adf3db776615ea2" exitCode=1 Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.150886 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" 
event={"ID":"7ffd183f-20a4-4586-ac75-597797ada23c","Type":"ContainerDied","Data":"f2010f0f0b777b55c4bd3f94257ee1b2cb3f040b26a6fc492adf3db776615ea2"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.152021 4912 scope.go:117] "RemoveContainer" containerID="f2010f0f0b777b55c4bd3f94257ee1b2cb3f040b26a6fc492adf3db776615ea2" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.156675 4912 generic.go:334] "Generic (PLEG): container finished" podID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerID="7dae4616235a902ccd3a026d9bf2db75cb2b0d8e0a9ceba0743173e72fddbb0e" exitCode=0 Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.156735 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" event={"ID":"a137f3f5-d649-4e59-80a5-0aedb734a766","Type":"ContainerDied","Data":"7dae4616235a902ccd3a026d9bf2db75cb2b0d8e0a9ceba0743173e72fddbb0e"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.167213 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-bbvtw_7feb8268-723e-408b-b800-744481779d38/router/0.log" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.168340 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bbvtw" event={"ID":"7feb8268-723e-408b-b800-744481779d38","Type":"ContainerStarted","Data":"ecd672bb6be73d41aa375771a8c5d832e26264a79ac5d7a15ea8fedc74032156"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.176249 4912 generic.go:334] "Generic (PLEG): container finished" podID="5661de32-ecd9-4450-b757-465370105082" containerID="1a2abf407dbb7a4d45cd9bea2e08dcd690fb6c3f51bcc6ef05aeb8e10c01d4c3" exitCode=0 Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.177135 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" 
event={"ID":"5661de32-ecd9-4450-b757-465370105082","Type":"ContainerDied","Data":"1a2abf407dbb7a4d45cd9bea2e08dcd690fb6c3f51bcc6ef05aeb8e10c01d4c3"} Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.177141 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus-operator-admission-webhook" containerStatusID={"Type":"cri-o","ID":"31c653dc504df4845b475e0d0527cac9baa2eab4a2325b160a5da64f7439c716"} pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" containerMessage="Container prometheus-operator-admission-webhook failed liveness probe, will be restarted" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.177218 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" podUID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerName="prometheus-operator-admission-webhook" containerID="cri-o://31c653dc504df4845b475e0d0527cac9baa2eab4a2325b160a5da64f7439c716" gracePeriod=30 Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.192136 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.353640 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-585bd669c7-vrxh8" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.369873 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" probeResult="failure" output="" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.440095 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 14:26:36 crc 
kubenswrapper[4912]: I0318 14:26:36.440291 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.440322 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 14:26:36 crc kubenswrapper[4912]: I0318 14:26:36.719628 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.018742 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output=< Mar 18 14:26:37 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:26:37 crc kubenswrapper[4912]: > Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.131731 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.132166 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get 
\"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.362492 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podUID="13092522-58a7-4c49-9164-41523060735e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": dial tcp 10.217.0.104:8081: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.362610 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.364268 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" podUID="13092522-58a7-4c49-9164-41523060735e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": dial tcp 10.217.0.104:8081: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.408973 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" podUID="2faefcc2-b6a3-4dee-a077-af88038f3565" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": dial tcp 10.217.0.105:8081: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.426675 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.448545 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= 
Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.448613 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.454961 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podUID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": dial tcp 10.217.0.107:8081: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.455121 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.456626 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" podUID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": dial tcp 10.217.0.107:8081: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.459351 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" podUID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": dial tcp 10.217.0.106:8081: connect: connection refused" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.461192 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" 
event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"58fd93dcde8655d5934e5025e83f0eb87453877890d7bc9c708dde9ed079abd8"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.461322 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"eb36b16a7c0e243fe904694742ec7e11a72b33aea2b15888a04c8540a42db45c"} pod="metallb-system/frr-k8s-ngzqk" containerMessage="Container frr failed liveness probe, will be restarted" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.461514 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-ngzqk" podUID="0475f7b9-387c-422d-88c8-90416895b720" containerName="frr" containerID="cri-o://eb36b16a7c0e243fe904694742ec7e11a72b33aea2b15888a04c8540a42db45c" gracePeriod=2 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.461336 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ngzqk" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.472083 4912 generic.go:334] "Generic (PLEG): container finished" podID="1f17f2a1-55b9-493b-9a8a-3d53f21becb9" containerID="94e423b83257ecda2eecb455aa5efd7278789ae3f0c6971115e9ea22f3cf700c" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.472283 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" event={"ID":"1f17f2a1-55b9-493b-9a8a-3d53f21becb9","Type":"ContainerDied","Data":"94e423b83257ecda2eecb455aa5efd7278789ae3f0c6971115e9ea22f3cf700c"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.476402 4912 scope.go:117] "RemoveContainer" containerID="94e423b83257ecda2eecb455aa5efd7278789ae3f0c6971115e9ea22f3cf700c" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.476997 4912 generic.go:334] "Generic (PLEG): container finished" podID="10c9b954-d1cb-4055-a082-5b06828b5faa" 
containerID="b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00" exitCode=0 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.477279 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tkt7x" event={"ID":"10c9b954-d1cb-4055-a082-5b06828b5faa","Type":"ContainerDied","Data":"b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.551637 4912 generic.go:334] "Generic (PLEG): container finished" podID="f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0" containerID="b70cb9746b9e2dce40680ff2d90f739d7ee651330834b5714502ea08dbc0f8aa" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.552096 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" event={"ID":"f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0","Type":"ContainerDied","Data":"b70cb9746b9e2dce40680ff2d90f739d7ee651330834b5714502ea08dbc0f8aa"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.553549 4912 scope.go:117] "RemoveContainer" containerID="b70cb9746b9e2dce40680ff2d90f739d7ee651330834b5714502ea08dbc0f8aa" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.559791 4912 generic.go:334] "Generic (PLEG): container finished" podID="3821e364-991e-4a58-88e6-cf499d12aa70" containerID="d89a66518ba4c47e2bf682f15b4d61dc0c7f7e9e25a8fd2adf64629b8dd829af" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.559891 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" event={"ID":"3821e364-991e-4a58-88e6-cf499d12aa70","Type":"ContainerDied","Data":"d89a66518ba4c47e2bf682f15b4d61dc0c7f7e9e25a8fd2adf64629b8dd829af"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.560906 4912 scope.go:117] "RemoveContainer" containerID="d89a66518ba4c47e2bf682f15b4d61dc0c7f7e9e25a8fd2adf64629b8dd829af" Mar 18 14:26:37 crc 
kubenswrapper[4912]: I0318 14:26:37.584539 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.612950 4912 generic.go:334] "Generic (PLEG): container finished" podID="98fed63c-9006-4589-a119-1e25fb115041" containerID="a1418d2c6a7c6ce8c89ee270bd44091f9a4eb4291e4d378d52ca09e5e39fa140" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.613116 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" event={"ID":"98fed63c-9006-4589-a119-1e25fb115041","Type":"ContainerDied","Data":"a1418d2c6a7c6ce8c89ee270bd44091f9a4eb4291e4d378d52ca09e5e39fa140"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.613447 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.628312 4912 scope.go:117] "RemoveContainer" containerID="a1418d2c6a7c6ce8c89ee270bd44091f9a4eb4291e4d378d52ca09e5e39fa140" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.643095 4912 generic.go:334] "Generic (PLEG): container finished" podID="2faefcc2-b6a3-4dee-a077-af88038f3565" containerID="8829fefdc365d80d71ceeb78a39c2717f8e8a83b1808640b31d91a6c19c069f7" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.643195 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" event={"ID":"2faefcc2-b6a3-4dee-a077-af88038f3565","Type":"ContainerDied","Data":"8829fefdc365d80d71ceeb78a39c2717f8e8a83b1808640b31d91a6c19c069f7"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.644169 4912 scope.go:117] "RemoveContainer" containerID="8829fefdc365d80d71ceeb78a39c2717f8e8a83b1808640b31d91a6c19c069f7" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.668111 4912 
generic.go:334] "Generic (PLEG): container finished" podID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerID="4317cf97fdfbf35b04a793d8a339ef489d493e75a32877049563389b71ba16e5" exitCode=0 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.668203 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" event={"ID":"e6df54ff-ee21-4b6b-bab8-86839f9a035c","Type":"ContainerDied","Data":"4317cf97fdfbf35b04a793d8a339ef489d493e75a32877049563389b71ba16e5"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.678009 4912 generic.go:334] "Generic (PLEG): container finished" podID="692fb335-57d8-465c-b7ef-d94c53f84523" containerID="a2838d2a3358ca2a78c27e005b22cc589be8c26b55d94ee076c37a6fd1abc08d" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.678247 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" event={"ID":"692fb335-57d8-465c-b7ef-d94c53f84523","Type":"ContainerDied","Data":"a2838d2a3358ca2a78c27e005b22cc589be8c26b55d94ee076c37a6fd1abc08d"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.691428 4912 generic.go:334] "Generic (PLEG): container finished" podID="67ab4d42-cf77-45ce-9bf7-f0db056c4151" containerID="25aa0d27b8703286979ccc95a6d60caa8a185ec8897641b6a34d16071bb0623f" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.691589 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" event={"ID":"67ab4d42-cf77-45ce-9bf7-f0db056c4151","Type":"ContainerDied","Data":"25aa0d27b8703286979ccc95a6d60caa8a185ec8897641b6a34d16071bb0623f"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.699080 4912 scope.go:117] "RemoveContainer" containerID="a2838d2a3358ca2a78c27e005b22cc589be8c26b55d94ee076c37a6fd1abc08d" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.699815 4912 
scope.go:117] "RemoveContainer" containerID="25aa0d27b8703286979ccc95a6d60caa8a185ec8897641b6a34d16071bb0623f" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.711849 4912 generic.go:334] "Generic (PLEG): container finished" podID="e7b90186-2a06-42a0-aec9-8d8f27dfe4dd" containerID="e5e4d7773dc1bbce8c3eb432d06c7b2d1b8e34356cf450d0f3924d04a6958ae8" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.711940 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" event={"ID":"e7b90186-2a06-42a0-aec9-8d8f27dfe4dd","Type":"ContainerDied","Data":"e5e4d7773dc1bbce8c3eb432d06c7b2d1b8e34356cf450d0f3924d04a6958ae8"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.712747 4912 scope.go:117] "RemoveContainer" containerID="e5e4d7773dc1bbce8c3eb432d06c7b2d1b8e34356cf450d0f3924d04a6958ae8" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.742672 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.757327 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.765206 4912 generic.go:334] "Generic (PLEG): container finished" podID="b210beca-0aed-404b-9af5-b704345ce2f8" containerID="70ce4157388f50d78ac10e04e377efb6d37df965da999fee14c6b0b9bc0f0098" exitCode=0 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.765308 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" event={"ID":"b210beca-0aed-404b-9af5-b704345ce2f8","Type":"ContainerDied","Data":"70ce4157388f50d78ac10e04e377efb6d37df965da999fee14c6b0b9bc0f0098"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.810730 4912 
generic.go:334] "Generic (PLEG): container finished" podID="eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b" containerID="1e366ffa0c569c413306141c62fa70428165be06b91a7f09bb4263f76a862376" exitCode=0 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.810854 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b","Type":"ContainerDied","Data":"1e366ffa0c569c413306141c62fa70428165be06b91a7f09bb4263f76a862376"} Mar 18 14:26:37 crc kubenswrapper[4912]: E0318 14:26:37.822080 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.829681 4912 generic.go:334] "Generic (PLEG): container finished" podID="13092522-58a7-4c49-9164-41523060735e" containerID="37b49390da442ae7479fee7189cc6177ae463fef97b4a273c2ef012771885ab2" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.829856 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" event={"ID":"13092522-58a7-4c49-9164-41523060735e","Type":"ContainerDied","Data":"37b49390da442ae7479fee7189cc6177ae463fef97b4a273c2ef012771885ab2"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.834372 4912 scope.go:117] "RemoveContainer" containerID="37b49390da442ae7479fee7189cc6177ae463fef97b4a273c2ef012771885ab2" Mar 18 14:26:37 crc kubenswrapper[4912]: E0318 14:26:37.849669 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b" 
cmd=["grpc_health_probe","-addr=:50051"] Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.856109 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.856936 4912 generic.go:334] "Generic (PLEG): container finished" podID="59c7d762-a4b8-452d-8824-572aa03c40fd" containerID="31c653dc504df4845b475e0d0527cac9baa2eab4a2325b160a5da64f7439c716" exitCode=0 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.857255 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" event={"ID":"59c7d762-a4b8-452d-8824-572aa03c40fd","Type":"ContainerDied","Data":"31c653dc504df4845b475e0d0527cac9baa2eab4a2325b160a5da64f7439c716"} Mar 18 14:26:37 crc kubenswrapper[4912]: E0318 14:26:37.868918 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 14:26:37 crc kubenswrapper[4912]: E0318 14:26:37.868976 4912 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.911829 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-vqtmj" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.912077 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-57c55bf5f4-gflkq" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.928432 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.930494 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.933255 4912 generic.go:334] "Generic (PLEG): container finished" podID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerID="a3882757bbf19032efcdfc98738b4713ab8eb41ae9cfa9af6fc0a0163e543a3b" exitCode=0 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.933318 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" event={"ID":"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e","Type":"ContainerDied","Data":"a3882757bbf19032efcdfc98738b4713ab8eb41ae9cfa9af6fc0a0163e543a3b"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.941598 4912 generic.go:334] "Generic (PLEG): container finished" podID="6ff20347-b4ef-4d01-966c-5ba69dcf546c" containerID="9881ef1d6f722210a00ffd35d8692d65cbd085c6292c551ddd164ef4a9e9efde" exitCode=1 Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.943138 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" event={"ID":"6ff20347-b4ef-4d01-966c-5ba69dcf546c","Type":"ContainerDied","Data":"9881ef1d6f722210a00ffd35d8692d65cbd085c6292c551ddd164ef4a9e9efde"} Mar 18 14:26:37 crc kubenswrapper[4912]: I0318 14:26:37.944417 4912 scope.go:117] "RemoveContainer" containerID="9881ef1d6f722210a00ffd35d8692d65cbd085c6292c551ddd164ef4a9e9efde" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 
14:26:38.001740 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.037076 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": dial tcp 10.217.0.114:8081: connect: connection refused" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.037224 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.037903 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" podUID="6afa3dcd-776b-4472-9e54-31e102d2fb67" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": dial tcp 10.217.0.114:8081: connect: connection refused" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.049500 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.092445 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.320664 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 14:26:38 crc kubenswrapper[4912]: E0318 14:26:38.362987 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking 
if PID of b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00 is running failed: container process not found" containerID="b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 14:26:38 crc kubenswrapper[4912]: E0318 14:26:38.364148 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00 is running failed: container process not found" containerID="b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 14:26:38 crc kubenswrapper[4912]: E0318 14:26:38.377260 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00 is running failed: container process not found" containerID="b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 14:26:38 crc kubenswrapper[4912]: E0318 14:26:38.377708 4912 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b4c1a85d0e474e11c6bb1c091440e2c7ced2802f53e6b199cd9e383b7ec31c00 is running failed: container process not found" probeType="Readiness" pod="openstack-operators/openstack-operator-index-tkt7x" podUID="10c9b954-d1cb-4055-a082-5b06828b5faa" containerName="registry-server" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.417527 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" containerID="cri-o://2c12695758a8e030ab1bbae40231e9e1240ab28ee6b2233948263853940f7e9f" gracePeriod=21 Mar 18 14:26:38 crc kubenswrapper[4912]: 
I0318 14:26:38.440821 4912 patch_prober.go:28] interesting pod/router-default-5444994796-bbvtw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.440873 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bbvtw" podUID="7feb8268-723e-408b-b800-744481779d38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.623651 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.684417 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" containerID="cri-o://0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904" gracePeriod=20 Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.688168 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-54d55b7b75-h9lqp" Mar 18 14:26:38 crc kubenswrapper[4912]: I0318 14:26:38.785769 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-vgvxb" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.017105 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7bb9f46ccc-95zvh" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.088965 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="6afa3dcd-776b-4472-9e54-31e102d2fb67" containerID="984933cc67078f32c68a626439599558baa5f5b7414e6b677c6236ee51856a37" exitCode=1 Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.089159 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" event={"ID":"6afa3dcd-776b-4472-9e54-31e102d2fb67","Type":"ContainerDied","Data":"984933cc67078f32c68a626439599558baa5f5b7414e6b677c6236ee51856a37"} Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.090516 4912 scope.go:117] "RemoveContainer" containerID="984933cc67078f32c68a626439599558baa5f5b7414e6b677c6236ee51856a37" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.117383 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" event={"ID":"e5f93e56-4ca9-413c-9954-f94f182b6606","Type":"ContainerStarted","Data":"b664f22ced09c648d6b0d08ba6fccf74aa9c70e059cbe14af7e1d8dec913bff5"} Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.119681 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.158686 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" event={"ID":"8efdcb68-92df-434c-8446-5be1ef0a94ba","Type":"ContainerStarted","Data":"3254ab31ea20efe0318d11dc264fa716b96b01ed0fde5059e4029190e9ce7dc6"} Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.187845 4912 generic.go:334] "Generic (PLEG): container finished" podID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerID="192e1c96eff03e0f62710194722b73674fcb0688acd78aa035037f1792a605b1" exitCode=0 Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.187974 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" event={"ID":"5f2f03ae-9287-4840-bcda-91d0b68849d7","Type":"ContainerDied","Data":"192e1c96eff03e0f62710194722b73674fcb0688acd78aa035037f1792a605b1"} Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.192096 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-lmtc2"] Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.196759 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" event={"ID":"718af076-f027-4594-8294-53ec36b84f3c","Type":"ContainerStarted","Data":"2f35b6b36e16f2fe39212a5b2283b0f1a2c3ae16cba278175446d3ba5e0dcb22"} Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.198449 4912 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c4jkh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.198603 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" podUID="718af076-f027-4594-8294-53ec36b84f3c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.198448 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.209672 4912 generic.go:334] "Generic (PLEG): container finished" podID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerID="09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b" exitCode=0 Mar 18 
14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.209751 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98dk7" event={"ID":"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a","Type":"ContainerDied","Data":"09051bff5db5e2f41daebfe0df4aa704b89723a5cbbcfd1d90c24e0d3b45343b"} Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.249813 4912 generic.go:334] "Generic (PLEG): container finished" podID="0475f7b9-387c-422d-88c8-90416895b720" containerID="eb36b16a7c0e243fe904694742ec7e11a72b33aea2b15888a04c8540a42db45c" exitCode=143 Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.250961 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerDied","Data":"eb36b16a7c0e243fe904694742ec7e11a72b33aea2b15888a04c8540a42db45c"} Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.451730 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.584958 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.847481 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" podUID="7d7516e2-d2c4-4f18-9cc6-d2aad94db27e" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.91:7572/metrics\": dial tcp 10.217.0.91:7572: connect: connection refused" Mar 18 14:26:39 crc kubenswrapper[4912]: I0318 14:26:39.901363 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:26:40 crc 
kubenswrapper[4912]: I0318 14:26:40.136218 4912 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-tqk5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.136279 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" podUID="b210beca-0aed-404b-9af5-b704345ce2f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.136428 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jkd5w" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.209064 4912 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mjtnc container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.209519 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" podUID="5f2f03ae-9287-4840-bcda-91d0b68849d7" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.315049 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c4jkh" Mar 18 14:26:40 crc 
kubenswrapper[4912]: I0318 14:26:40.315099 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ptbgq" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.384627 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" event={"ID":"9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4","Type":"ContainerStarted","Data":"0d6a1ef2b5b24e3aa33d9f7851d9a27a77f9f62f0201ba5360533ce6111ace3d"} Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.384833 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.390030 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" event={"ID":"54c16250-e6b3-4308-bf15-d4633c661d9e","Type":"ContainerStarted","Data":"0e73ae46ed3751648a9ea25b95e629330b60ff0dd1f06fc0185023dd7ac04eba"} Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.390533 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.406082 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-bbvtw" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.526401 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tjwtn" Mar 18 14:26:40 crc kubenswrapper[4912]: E0318 14:26:40.583844 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.798922 4912 trace.go:236] Trace[111770570]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-1" (18-Mar-2026 14:26:38.546) (total time: 2248ms): Mar 18 14:26:40 crc kubenswrapper[4912]: Trace[111770570]: [2.248952334s] [2.248952334s] END Mar 18 14:26:40 crc kubenswrapper[4912]: I0318 14:26:40.860887 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zw7wz" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.021549 4912 trace.go:236] Trace[43766132]: "Calculate volume metrics of prometheus-metric-storage-db for pod openstack/prometheus-metric-storage-0" (18-Mar-2026 14:26:39.950) (total time: 1071ms): Mar 18 14:26:41 crc kubenswrapper[4912]: Trace[43766132]: [1.071395101s] [1.071395101s] END Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.408320 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.411959 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.412183 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"feadda7752d665253fe0144b7b84390ad4ee29aac7be72b73f1dc3bb0905c4cc"} Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.417653 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" event={"ID":"f695b268-a8b7-4b72-a37b-dd342d7d369a","Type":"ContainerStarted","Data":"e37a0b957ba92f50b308497da8bb60f05d57918b86dd3ad2dcb884b9d95afded"} Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.417816 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.427003 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" event={"ID":"7ffd183f-20a4-4586-ac75-597797ada23c","Type":"ContainerStarted","Data":"88db23aa71c748fd4ba74b4653a9a19729470bf4ce7bd5236f56c65fca856bac"} Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.428625 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.440245 4912 generic.go:334] "Generic (PLEG): container finished" podID="77736799-2ebe-4076-9717-6741aed93599" containerID="2c12695758a8e030ab1bbae40231e9e1240ab28ee6b2233948263853940f7e9f" exitCode=0 Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.440345 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77736799-2ebe-4076-9717-6741aed93599","Type":"ContainerDied","Data":"2c12695758a8e030ab1bbae40231e9e1240ab28ee6b2233948263853940f7e9f"} Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.456752 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sxhvr" event={"ID":"5661de32-ecd9-4450-b757-465370105082","Type":"ContainerStarted","Data":"6c085c035626a1f0f8f819871c68315ffbfb4448ff0181d7ea4654a07932a806"} Mar 18 14:26:41 
crc kubenswrapper[4912]: I0318 14:26:41.464534 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" event={"ID":"334b170e-0f84-42b2-81a6-8c469d187fa3","Type":"ContainerStarted","Data":"b268e9a4211c7dcda5daa6e0f71ec4c0b84e83d3217b477aaf0974137d49b4b7"} Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.464696 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.482914 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" event={"ID":"45ef8022-adf2-46bc-a112-a5532880c080","Type":"ContainerStarted","Data":"33870022ea61e3983262b0dd65131906b734740c425828ce462a9e462720a923"} Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.483576 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.506275 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" event={"ID":"a137f3f5-d649-4e59-80a5-0aedb734a766","Type":"ContainerStarted","Data":"e95fba8f0b4865fd345cf02d741ee9f7d2580628486a12a9efbfc6554a9d44d9"} Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.508745 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.557428 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" event={"ID":"b210beca-0aed-404b-9af5-b704345ce2f8","Type":"ContainerStarted","Data":"18d98d0bbb4d1fd38e35e8349ce25dbad2114685df2eff31a7d76038bfadaf6b"} Mar 18 
14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.563446 4912 patch_prober.go:28] interesting pod/controller-manager-758896dd6-55gnf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.563558 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" podUID="a137f3f5-d649-4e59-80a5-0aedb734a766" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.609911 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 14:26:41 crc kubenswrapper[4912]: I0318 14:26:41.966631 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-s2ztv" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.194060 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-jgbgh" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.275845 4912 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" containerID="cri-o://70ce4157388f50d78ac10e04e377efb6d37df965da999fee14c6b0b9bc0f0098" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.275885 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.275935 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-tcbf4" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.424578 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.585371 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" event={"ID":"7d7516e2-d2c4-4f18-9cc6-d2aad94db27e","Type":"ContainerStarted","Data":"b80ff99531a3073f0c55db89698f6549e722a256f76ee451b7d728b5da504c61"} Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.585702 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.593917 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" event={"ID":"59c7d762-a4b8-452d-8824-572aa03c40fd","Type":"ContainerStarted","Data":"099e36a4997f07e7fb72b4c8085b0806b0492df309bf6f937d8a8cd517fe5ab4"} Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.594963 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.600417 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-758896dd6-55gnf" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.823599 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:26:42 crc kubenswrapper[4912]: I0318 14:26:42.830522 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.138460 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tqk5x" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.201847 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.339579 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.363539 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.433173 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.696441 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" event={"ID":"5f2f03ae-9287-4840-bcda-91d0b68849d7","Type":"ContainerStarted","Data":"6e73f2d93e4217bf8884affd96068ad494cc06462e8dbd4c962905764ff80659"} Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.698901 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 14:26:43 crc kubenswrapper[4912]: E0318 14:26:43.703747 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": 
RecentStats: unable to find data in memory cache]" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.760257 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ngzqk" event={"ID":"0475f7b9-387c-422d-88c8-90416895b720","Type":"ContainerStarted","Data":"db686e16a1c4e33206abb4ca4598ece27372d6be6e128548c9e0460ea34d1eff"} Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.784179 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" event={"ID":"3821e364-991e-4a58-88e6-cf499d12aa70","Type":"ContainerStarted","Data":"d7585f4447f2f145787f0c9cec6ad691d0dd528a8a9532f3cdb2d78fb98c1613"} Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.785275 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.805224 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" event={"ID":"98fed63c-9006-4589-a119-1e25fb115041","Type":"ContainerStarted","Data":"e9e7b13866bc15af49bd6e303f3bfdc7cc668a996049525ec358b510e16dced4"} Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.806146 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.806184 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.806199 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.813322 4912 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-84z2w" Mar 18 14:26:43 crc kubenswrapper[4912]: I0318 14:26:43.905956 4912 trace.go:236] Trace[370849018]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (18-Mar-2026 14:26:42.595) (total time: 1310ms): Mar 18 14:26:43 crc kubenswrapper[4912]: Trace[370849018]: [1.310442471s] [1.310442471s] END Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.267085 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ngzqk" Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.335843 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ngzqk" Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.859603 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" event={"ID":"2faefcc2-b6a3-4dee-a077-af88038f3565","Type":"ContainerStarted","Data":"731b55ff5efbbc1d764a87e5b40db52795d5f9622286a72ef7a254cd2661fc8e"} Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.861610 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.884499 4912 generic.go:334] "Generic (PLEG): container finished" podID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerID="0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904" exitCode=0 Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.884715 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0973556-9c2c-4037-b800-d11ecf1904cc","Type":"ContainerDied","Data":"0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904"} Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.940472 4912 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-98dk7" event={"ID":"73cfec7d-c7e6-4beb-9a85-f161c2c7c31a","Type":"ContainerStarted","Data":"201b4986b8943b9d0744e49e0ba73460ce0fe4dc3f9a1d88d6312ab9a5780550"} Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.975014 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" event={"ID":"6afa3dcd-776b-4472-9e54-31e102d2fb67","Type":"ContainerStarted","Data":"2504aafdda48bb1e577f309c521529d969e83cb17b9ee096ad95d00c9ff23c8d"} Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.976655 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 14:26:44 crc kubenswrapper[4912]: I0318 14:26:44.999599 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" event={"ID":"f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0","Type":"ContainerStarted","Data":"f4079664e56171926de1be8069e5f5f0d032791ecbed15171d8bcc5686f03859"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.001255 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.009754 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" event={"ID":"6ff20347-b4ef-4d01-966c-5ba69dcf546c","Type":"ContainerStarted","Data":"f22d3e39db8af090b1b08cefe0bb90d9bd9fc548be26c5ccd3e4a36558a2de9d"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.011531 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.037412 4912 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/openstack-operator-index-tkt7x" event={"ID":"10c9b954-d1cb-4055-a082-5b06828b5faa","Type":"ContainerStarted","Data":"a54beaac5322e91b87de2ce65ee582712f795dd0762ec7c3ae97e3838c06d9e2"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.077873 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b","Type":"ContainerStarted","Data":"4b7b46406c6f03d59b5c654a1c29999e9cc9205bf83af5c6d2840ff294c0f488"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.108891 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77736799-2ebe-4076-9717-6741aed93599","Type":"ContainerStarted","Data":"8b227674dd0432bad3ed314f97fcb7824cdaa0a6dafaf8654bbb755a382e6714"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.123652 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" event={"ID":"54c16250-e6b3-4308-bf15-d4633c661d9e","Type":"ContainerStarted","Data":"bc60b4e43daef71b12850509b354819220cb4d76af8af7327ab323aa87475de8"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.140101 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" event={"ID":"1f17f2a1-55b9-493b-9a8a-3d53f21becb9","Type":"ContainerStarted","Data":"84150ae5d8375b034424a02a9d47c6a3abe74a04ee388eb27f0f13079bdd05b2"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.140442 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.162612 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" 
event={"ID":"e7b90186-2a06-42a0-aec9-8d8f27dfe4dd","Type":"ContainerStarted","Data":"98e760cdf7c03a9b68e14527a05bc9fa77462201257ced09ff82c8399af7657f"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.164137 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.175940 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" event={"ID":"692fb335-57d8-465c-b7ef-d94c53f84523","Type":"ContainerStarted","Data":"7dce9e3dfa902870f97381bf14fe55478e71981af297cf5c1f835db328bef8cd"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.177768 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.178950 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-867987c6b7-jg2ct" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.197473 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" event={"ID":"67ab4d42-cf77-45ce-9bf7-f0db056c4151","Type":"ContainerStarted","Data":"98d9f2f4ddaec9fbddfd6b231e163c1dcacd9019834fc5c7c3bdaf8b6004de61"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.199979 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.209665 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" 
event={"ID":"e6df54ff-ee21-4b6b-bab8-86839f9a035c","Type":"ContainerStarted","Data":"a6507a67a485315090241ef6c1048dd8ecff7fc5f39c3a30a54b16fd947b972d"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.210399 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.210471 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.210505 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.225562 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" event={"ID":"13092522-58a7-4c49-9164-41523060735e","Type":"ContainerStarted","Data":"2640a0da300513d7c743d1bbaef8c07325d7b192303faebe0e07cfe0da67a47a"} Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.226435 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.299466 4912 patch_prober.go:28] interesting pod/route-controller-manager-76df45d45-cmf7b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.299555 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" podUID="e6df54ff-ee21-4b6b-bab8-86839f9a035c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.454357 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" podStartSLOduration=43.478245009 podStartE2EDuration="44.452822799s" podCreationTimestamp="2026-03-18 14:26:01 +0000 UTC" firstStartedPulling="2026-03-18 14:26:40.428513712 +0000 UTC m=+5048.887941137" lastFinishedPulling="2026-03-18 14:26:41.403091502 +0000 UTC m=+5049.862518927" observedRunningTime="2026-03-18 14:26:45.342309618 +0000 UTC m=+5053.801737053" watchObservedRunningTime="2026-03-18 14:26:45.452822799 +0000 UTC m=+5053.912250224" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.822112 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.822210 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.823223 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"7ee1cc1a6d2764eff6597a2ce8080b1c63edb6d412c882b12e17490f7db146c1"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed 
liveness probe, will be restarted" Mar 18 14:26:45 crc kubenswrapper[4912]: I0318 14:26:45.823281 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" containerID="cri-o://7ee1cc1a6d2764eff6597a2ce8080b1c63edb6d412c882b12e17490f7db146c1" gracePeriod=30 Mar 18 14:26:46 crc kubenswrapper[4912]: E0318 14:26:46.077526 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904 is running failed: container process not found" containerID="0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 14:26:46 crc kubenswrapper[4912]: E0318 14:26:46.078990 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904 is running failed: container process not found" containerID="0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 14:26:46 crc kubenswrapper[4912]: E0318 14:26:46.079574 4912 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904 is running failed: container process not found" containerID="0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 14:26:46 crc kubenswrapper[4912]: E0318 14:26:46.079674 4912 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created 
or running: checking if PID of 0748ad7856751cef464d71a6d7501190fe2f3ac613a87254b423eba198ec6904 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="d0973556-9c2c-4037-b800-d11ecf1904cc" containerName="galera" Mar 18 14:26:46 crc kubenswrapper[4912]: I0318 14:26:46.243393 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0973556-9c2c-4037-b800-d11ecf1904cc","Type":"ContainerStarted","Data":"6662bcf2978a5418b21ee358b1096f1713582dff28582d8e2163c9750437955a"} Mar 18 14:26:46 crc kubenswrapper[4912]: I0318 14:26:46.267347 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ngzqk" Mar 18 14:26:46 crc kubenswrapper[4912]: I0318 14:26:46.267429 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76df45d45-cmf7b" Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.268210 4912 generic.go:334] "Generic (PLEG): container finished" podID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerID="7ee1cc1a6d2764eff6597a2ce8080b1c63edb6d412c882b12e17490f7db146c1" exitCode=0 Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.270006 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc","Type":"ContainerDied","Data":"7ee1cc1a6d2764eff6597a2ce8080b1c63edb6d412c882b12e17490f7db146c1"} Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.368936 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.371216 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.618806 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-wljg2" Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.785801 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.786277 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 14:26:47 crc kubenswrapper[4912]: I0318 14:26:47.858840 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-6ksxg" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.003734 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-zp69w" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.132166 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-f95vk" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.288965 4912 generic.go:334] "Generic (PLEG): container finished" podID="54c16250-e6b3-4308-bf15-d4633c661d9e" containerID="bc60b4e43daef71b12850509b354819220cb4d76af8af7327ab323aa87475de8" exitCode=0 Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.289221 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" event={"ID":"54c16250-e6b3-4308-bf15-d4633c661d9e","Type":"ContainerDied","Data":"bc60b4e43daef71b12850509b354819220cb4d76af8af7327ab323aa87475de8"} Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.296938 4912 generic.go:334] "Generic (PLEG): container finished" podID="fab7b705-5ef2-46e6-851d-5c38d246ee55" containerID="43990ccac5d94e76aadc376be4e39c15f7e1069f37d2b8aa71d5d9ff32e9f697" exitCode=1 Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 
14:26:48.297028 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fab7b705-5ef2-46e6-851d-5c38d246ee55","Type":"ContainerDied","Data":"43990ccac5d94e76aadc376be4e39c15f7e1069f37d2b8aa71d5d9ff32e9f697"} Mar 18 14:26:48 crc kubenswrapper[4912]: E0318 14:26:48.316108 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfab7b705_5ef2_46e6_851d_5c38d246ee55.slice/crio-43990ccac5d94e76aadc376be4e39c15f7e1069f37d2b8aa71d5d9ff32e9f697.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:26:48 crc kubenswrapper[4912]: E0318 14:26:48.316254 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfab7b705_5ef2_46e6_851d_5c38d246ee55.slice/crio-43990ccac5d94e76aadc376be4e39c15f7e1069f37d2b8aa71d5d9ff32e9f697.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.323994 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-52z7q" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.350181 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 
14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.350641 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.422134 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.515580 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-54bbf46695-l6jq5" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.628536 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-drnxt" Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.857861 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output=< Mar 18 14:26:48 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:26:48 crc kubenswrapper[4912]: > Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.965972 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2tbxp"] Mar 18 14:26:48 crc kubenswrapper[4912]: I0318 14:26:48.970368 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.011849 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tbxp"] Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.048949 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-catalog-content\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.049140 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8h6\" (UniqueName: \"kubernetes.io/projected/ee256d64-8bad-4105-8525-88fca8e28757-kube-api-access-7k8h6\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.049964 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-utilities\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.152899 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-utilities\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.153695 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-utilities\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.154607 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-catalog-content\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.155155 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-catalog-content\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.156218 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8h6\" (UniqueName: \"kubernetes.io/projected/ee256d64-8bad-4105-8525-88fca8e28757-kube-api-access-7k8h6\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.195947 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8h6\" (UniqueName: \"kubernetes.io/projected/ee256d64-8bad-4105-8525-88fca8e28757-kube-api-access-7k8h6\") pod \"redhat-operators-2tbxp\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.305701 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.335311 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc","Type":"ContainerStarted","Data":"37df2eea370458ef9260a5ab9e115e94ef083742e9dbd178a0da2e4608935252"} Mar 18 14:26:49 crc kubenswrapper[4912]: I0318 14:26:49.405561 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tkt7x" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.595230 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tbxp"] Mar 18 14:26:50 crc kubenswrapper[4912]: W0318 14:26:50.605117 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee256d64_8bad_4105_8525_88fca8e28757.slice/crio-d0bec1fa32a51d9e8cd5bd4d92f5d12afdc8103e0ff5cd4aa96e342966c094a8 WatchSource:0}: Error finding container d0bec1fa32a51d9e8cd5bd4d92f5d12afdc8103e0ff5cd4aa96e342966c094a8: Status 404 returned error can't find the container with id d0bec1fa32a51d9e8cd5bd4d92f5d12afdc8103e0ff5cd4aa96e342966c094a8 Mar 18 14:26:50 crc kubenswrapper[4912]: E0318 14:26:50.669415 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.794300 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.802169 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956303 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956399 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-temporary\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956490 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956525 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ssh-key\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956608 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-workdir\") pod 
\"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956657 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-config-data\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956720 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ca-certs\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956803 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98pc2\" (UniqueName: \"kubernetes.io/projected/fab7b705-5ef2-46e6-851d-5c38d246ee55-kube-api-access-98pc2\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956877 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9l8b\" (UniqueName: \"kubernetes.io/projected/54c16250-e6b3-4308-bf15-d4633c661d9e-kube-api-access-l9l8b\") pod \"54c16250-e6b3-4308-bf15-d4633c661d9e\" (UID: \"54c16250-e6b3-4308-bf15-d4633c661d9e\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.956922 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config-secret\") pod \"fab7b705-5ef2-46e6-851d-5c38d246ee55\" (UID: \"fab7b705-5ef2-46e6-851d-5c38d246ee55\") " Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.962062 4912 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-config-data" (OuterVolumeSpecName: "config-data") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.963337 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.964405 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.972105 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab7b705-5ef2-46e6-851d-5c38d246ee55-kube-api-access-98pc2" (OuterVolumeSpecName: "kube-api-access-98pc2") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "kube-api-access-98pc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:50 crc kubenswrapper[4912]: I0318 14:26:50.974510 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.004133 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c16250-e6b3-4308-bf15-d4633c661d9e-kube-api-access-l9l8b" (OuterVolumeSpecName: "kube-api-access-l9l8b") pod "54c16250-e6b3-4308-bf15-d4633c661d9e" (UID: "54c16250-e6b3-4308-bf15-d4633c661d9e"). InnerVolumeSpecName "kube-api-access-l9l8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.010827 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.013123 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.032943 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.062245 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fab7b705-5ef2-46e6-851d-5c38d246ee55" (UID: "fab7b705-5ef2-46e6-851d-5c38d246ee55"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.062911 4912 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.062959 4912 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.062972 4912 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.062985 4912 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 
14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.062997 4912 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.063007 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98pc2\" (UniqueName: \"kubernetes.io/projected/fab7b705-5ef2-46e6-851d-5c38d246ee55-kube-api-access-98pc2\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.063017 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9l8b\" (UniqueName: \"kubernetes.io/projected/54c16250-e6b3-4308-bf15-d4633c661d9e-kube-api-access-l9l8b\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.063027 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.063069 4912 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fab7b705-5ef2-46e6-851d-5c38d246ee55-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.094824 4912 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.165287 4912 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab7b705-5ef2-46e6-851d-5c38d246ee55-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.165326 4912 reconciler_common.go:293] 
"Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.385458 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.386397 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fab7b705-5ef2-46e6-851d-5c38d246ee55","Type":"ContainerDied","Data":"c86be2adf5d4ca4a37f2d5b6401938b1e0a33cc393e3671fcf1863de309b6eb9"} Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.386473 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86be2adf5d4ca4a37f2d5b6401938b1e0a33cc393e3671fcf1863de309b6eb9" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.398019 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" event={"ID":"54c16250-e6b3-4308-bf15-d4633c661d9e","Type":"ContainerDied","Data":"0e73ae46ed3751648a9ea25b95e629330b60ff0dd1f06fc0185023dd7ac04eba"} Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.398086 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e73ae46ed3751648a9ea25b95e629330b60ff0dd1f06fc0185023dd7ac04eba" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.398092 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-lmtc2" Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.402008 4912 generic.go:334] "Generic (PLEG): container finished" podID="ee256d64-8bad-4105-8525-88fca8e28757" containerID="517ddabc9010b1457053fd0adee77afea24872b67feb00ed0ffd7cd128c4017e" exitCode=0 Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.402424 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tbxp" event={"ID":"ee256d64-8bad-4105-8525-88fca8e28757","Type":"ContainerDied","Data":"517ddabc9010b1457053fd0adee77afea24872b67feb00ed0ffd7cd128c4017e"} Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.402489 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tbxp" event={"ID":"ee256d64-8bad-4105-8525-88fca8e28757","Type":"ContainerStarted","Data":"d0bec1fa32a51d9e8cd5bd4d92f5d12afdc8103e0ff5cd4aa96e342966c094a8"} Mar 18 14:26:51 crc kubenswrapper[4912]: I0318 14:26:51.730661 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 14:26:52 crc kubenswrapper[4912]: I0318 14:26:52.429887 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tbxp" event={"ID":"ee256d64-8bad-4105-8525-88fca8e28757","Type":"ContainerStarted","Data":"bcfbb6d2cb3e3caad351a1131b02c5e5b64f45536d4b390d5cd5d930802b3106"} Mar 18 14:26:53 crc kubenswrapper[4912]: I0318 14:26:53.941580 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s" Mar 18 14:26:55 crc kubenswrapper[4912]: I0318 14:26:55.825140 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:26:56 crc kubenswrapper[4912]: I0318 14:26:56.077763 4912 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 14:26:56 crc kubenswrapper[4912]: I0318 14:26:56.078190 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 14:26:56 crc kubenswrapper[4912]: I0318 14:26:56.700929 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" containerID="cri-o://2d3916fa1a525ce458866b8ee169fe7859782f6da0b61c3d51454e00f476b357" gracePeriod=14 Mar 18 14:26:56 crc kubenswrapper[4912]: I0318 14:26:56.756112 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:26:56 crc kubenswrapper[4912]: I0318 14:26:56.895777 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c665c8f96-4wcs8" podUID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerName="console" containerID="cri-o://902fe890ebba0eba046de3dd16b6f2a5f9b183abc29577c612d48926a743e605" gracePeriod=15 Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.371476 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-5wn69" Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.405795 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-pcz7q" Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.456352 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-glhjp" Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 
14:26:57.467817 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-65r9q" Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.523439 4912 generic.go:334] "Generic (PLEG): container finished" podID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerID="2d3916fa1a525ce458866b8ee169fe7859782f6da0b61c3d51454e00f476b357" exitCode=0 Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.523537 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" event={"ID":"7aae5da4-fdd1-4295-bfed-a10638501acf","Type":"ContainerDied","Data":"2d3916fa1a525ce458866b8ee169fe7859782f6da0b61c3d51454e00f476b357"} Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.528567 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c665c8f96-4wcs8_0050439a-3a22-49ef-8b64-4fb98592d68b/console/0.log" Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.528663 4912 generic.go:334] "Generic (PLEG): container finished" podID="0050439a-3a22-49ef-8b64-4fb98592d68b" containerID="902fe890ebba0eba046de3dd16b6f2a5f9b183abc29577c612d48926a743e605" exitCode=2 Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.528709 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c665c8f96-4wcs8" event={"ID":"0050439a-3a22-49ef-8b64-4fb98592d68b","Type":"ContainerDied","Data":"902fe890ebba0eba046de3dd16b6f2a5f9b183abc29577c612d48926a743e605"} Mar 18 14:26:57 crc kubenswrapper[4912]: I0318 14:26:57.761320 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-72xxs" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.040634 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-l9d25" Mar 18 14:26:58 crc 
kubenswrapper[4912]: I0318 14:26:58.093615 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-wm76g" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.331581 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 14:26:58 crc kubenswrapper[4912]: E0318 14:26:58.332433 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c16250-e6b3-4308-bf15-d4633c661d9e" containerName="oc" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.332463 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c16250-e6b3-4308-bf15-d4633c661d9e" containerName="oc" Mar 18 14:26:58 crc kubenswrapper[4912]: E0318 14:26:58.332491 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab7b705-5ef2-46e6-851d-5c38d246ee55" containerName="tempest-tests-tempest-tests-runner" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.332499 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab7b705-5ef2-46e6-851d-5c38d246ee55" containerName="tempest-tests-tempest-tests-runner" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.332748 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab7b705-5ef2-46e6-851d-5c38d246ee55" containerName="tempest-tests-tempest-tests-runner" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.332785 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c16250-e6b3-4308-bf15-d4633c661d9e" containerName="oc" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.333904 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.342687 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6qzpq" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.357377 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.415031 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45qd\" (UniqueName: \"kubernetes.io/projected/7eed3616-3121-46a7-9781-b245e6694fb9-kube-api-access-g45qd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7eed3616-3121-46a7-9781-b245e6694fb9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.415184 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7eed3616-3121-46a7-9781-b245e6694fb9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.465885 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-9kt49" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.518072 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45qd\" (UniqueName: \"kubernetes.io/projected/7eed3616-3121-46a7-9781-b245e6694fb9-kube-api-access-g45qd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7eed3616-3121-46a7-9781-b245e6694fb9\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.518247 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7eed3616-3121-46a7-9781-b245e6694fb9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.518926 4912 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7eed3616-3121-46a7-9781-b245e6694fb9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.574183 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45qd\" (UniqueName: \"kubernetes.io/projected/7eed3616-3121-46a7-9781-b245e6694fb9-kube-api-access-g45qd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7eed3616-3121-46a7-9781-b245e6694fb9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.618219 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7eed3616-3121-46a7-9781-b245e6694fb9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.635337 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-c665c8f96-4wcs8_0050439a-3a22-49ef-8b64-4fb98592d68b/console/0.log" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.668521 4912 generic.go:334] "Generic (PLEG): container finished" podID="ee256d64-8bad-4105-8525-88fca8e28757" containerID="bcfbb6d2cb3e3caad351a1131b02c5e5b64f45536d4b390d5cd5d930802b3106" exitCode=0 Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.668584 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tbxp" event={"ID":"ee256d64-8bad-4105-8525-88fca8e28757","Type":"ContainerDied","Data":"bcfbb6d2cb3e3caad351a1131b02c5e5b64f45536d4b390d5cd5d930802b3106"} Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.701837 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.705079 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" event={"ID":"7aae5da4-fdd1-4295-bfed-a10638501acf","Type":"ContainerStarted","Data":"2f819c25995c638bde6c47115f4d51521dfd09f0472faf637263e7f4d116c563"} Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.706666 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.709274 4912 patch_prober.go:28] interesting pod/oauth-openshift-5f78599457-wr7bj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": dial tcp 10.217.0.65:6443: connect: connection refused" start-of-body= Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.709340 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" 
podUID="7aae5da4-fdd1-4295-bfed-a10638501acf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": dial tcp 10.217.0.65:6443: connect: connection refused" Mar 18 14:26:58 crc kubenswrapper[4912]: I0318 14:26:58.743903 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-6bzx6" Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.063850 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-98dk7" podUID="73cfec7d-c7e6-4beb-9a85-f161c2c7c31a" containerName="registry-server" probeResult="failure" output=< Mar 18 14:26:59 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:26:59 crc kubenswrapper[4912]: > Mar 18 14:26:59 crc kubenswrapper[4912]: E0318 14:26:59.252241 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.415207 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.722836 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tbxp" event={"ID":"ee256d64-8bad-4105-8525-88fca8e28757","Type":"ContainerStarted","Data":"590c33b801802fd5e13d72e4806f53a54ed39d99aa1fa5f8070859bbb724517a"} Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.727874 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" 
event={"ID":"7eed3616-3121-46a7-9781-b245e6694fb9","Type":"ContainerStarted","Data":"6ed66ec713f1c451b5440e9f2b0008f249d35ed7e7f86f98ac725ec5e3b287c6"} Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.737640 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c665c8f96-4wcs8_0050439a-3a22-49ef-8b64-4fb98592d68b/console/0.log" Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.738331 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c665c8f96-4wcs8" event={"ID":"0050439a-3a22-49ef-8b64-4fb98592d68b","Type":"ContainerStarted","Data":"f772c7e4a5e1288c862b7bada5a02798512843d3df86130e2532661ded806eb1"} Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.763520 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2tbxp" podStartSLOduration=4.008636041 podStartE2EDuration="11.763498928s" podCreationTimestamp="2026-03-18 14:26:48 +0000 UTC" firstStartedPulling="2026-03-18 14:26:51.405997328 +0000 UTC m=+5059.865424753" lastFinishedPulling="2026-03-18 14:26:59.160860215 +0000 UTC m=+5067.620287640" observedRunningTime="2026-03-18 14:26:59.753841216 +0000 UTC m=+5068.213268651" watchObservedRunningTime="2026-03-18 14:26:59.763498928 +0000 UTC m=+5068.222926353" Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.850770 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-grpwd" Mar 18 14:26:59 crc kubenswrapper[4912]: I0318 14:26:59.876898 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f78599457-wr7bj" Mar 18 14:27:00 crc kubenswrapper[4912]: E0318 14:27:00.784356 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:27:01 crc kubenswrapper[4912]: I0318 14:27:01.597467 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 14:27:01 crc kubenswrapper[4912]: I0318 14:27:01.598314 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 14:27:01 crc kubenswrapper[4912]: I0318 14:27:01.603707 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 14:27:01 crc kubenswrapper[4912]: I0318 14:27:01.774794 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7eed3616-3121-46a7-9781-b245e6694fb9","Type":"ContainerStarted","Data":"25f6234957145e196f07a49cf1ed149fea7a209dab9e1fc09cbd1f0dac02f4b6"} Mar 18 14:27:01 crc kubenswrapper[4912]: I0318 14:27:01.787604 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c665c8f96-4wcs8" Mar 18 14:27:01 crc kubenswrapper[4912]: I0318 14:27:01.807112 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.215273466 podStartE2EDuration="3.807079442s" podCreationTimestamp="2026-03-18 14:26:58 +0000 UTC" firstStartedPulling="2026-03-18 14:26:59.429124937 +0000 UTC m=+5067.888552362" lastFinishedPulling="2026-03-18 14:27:01.020930913 +0000 UTC m=+5069.480358338" observedRunningTime="2026-03-18 14:27:01.791745547 +0000 UTC m=+5070.251172982" watchObservedRunningTime="2026-03-18 14:27:01.807079442 +0000 UTC m=+5070.266506867" Mar 18 14:27:01 crc 
kubenswrapper[4912]: I0318 14:27:01.812200 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:27:06 crc kubenswrapper[4912]: I0318 14:27:06.759831 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:27:08 crc kubenswrapper[4912]: I0318 14:27:08.043586 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 14:27:08 crc kubenswrapper[4912]: I0318 14:27:08.200780 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-98dk7" Mar 18 14:27:08 crc kubenswrapper[4912]: I0318 14:27:08.837353 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-bklcg"] Mar 18 14:27:08 crc kubenswrapper[4912]: I0318 14:27:08.851939 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-bklcg"] Mar 18 14:27:09 crc kubenswrapper[4912]: I0318 14:27:09.307568 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:27:09 crc kubenswrapper[4912]: I0318 14:27:09.307634 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:27:10 crc kubenswrapper[4912]: I0318 14:27:10.249012 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ade81f-82a5-4b11-9aab-09c3c06cbcb0" path="/var/lib/kubelet/pods/92ade81f-82a5-4b11-9aab-09c3c06cbcb0/volumes" Mar 18 14:27:10 crc kubenswrapper[4912]: I0318 14:27:10.363515 
4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2tbxp" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" probeResult="failure" output=< Mar 18 14:27:10 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:27:10 crc kubenswrapper[4912]: > Mar 18 14:27:11 crc kubenswrapper[4912]: E0318 14:27:11.234878 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:27:11 crc kubenswrapper[4912]: I0318 14:27:11.760305 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:27:13 crc kubenswrapper[4912]: E0318 14:27:13.684374 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:27:16 crc kubenswrapper[4912]: I0318 14:27:16.794971 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:27:18 crc kubenswrapper[4912]: I0318 14:27:18.081856 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-666765756d-v7mtx" Mar 18 14:27:20 crc kubenswrapper[4912]: I0318 14:27:20.249991 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mjtnc" Mar 18 14:27:21 crc kubenswrapper[4912]: I0318 14:27:21.186153 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2tbxp" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" probeResult="failure" output=< Mar 18 14:27:21 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:27:21 crc kubenswrapper[4912]: > Mar 18 14:27:21 crc kubenswrapper[4912]: E0318 14:27:21.689759 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:27:21 crc kubenswrapper[4912]: I0318 14:27:21.783537 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:27:24 crc kubenswrapper[4912]: I0318 14:27:24.388849 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 14:27:24 crc kubenswrapper[4912]: I0318 14:27:24.534504 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 14:27:24 crc kubenswrapper[4912]: I0318 14:27:24.610623 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 14:27:25 crc kubenswrapper[4912]: 
I0318 14:27:25.213336 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 14:27:26 crc kubenswrapper[4912]: I0318 14:27:26.783968 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 14:27:28 crc kubenswrapper[4912]: E0318 14:27:28.923378 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:27:30 crc kubenswrapper[4912]: I0318 14:27:30.367692 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2tbxp" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" probeResult="failure" output=< Mar 18 14:27:30 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:27:30 crc kubenswrapper[4912]: > Mar 18 14:27:31 crc kubenswrapper[4912]: E0318 14:27:31.778150 4912 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7feb8268_723e_408b_b800_744481779d38.slice/crio-conmon-b1b808abd872143b6de9e2208a233c3d13757f4ff3d75f9bf184bb31859f0a34.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.730079 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfzh/must-gather-7mgpg"] Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.746417 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.760210 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nnfzh"/"openshift-service-ca.crt" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.760207 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nnfzh"/"default-dockercfg-gztrn" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.760231 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nnfzh"/"kube-root-ca.crt" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.849021 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnfzh/must-gather-7mgpg"] Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.851908 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/973a8efa-1884-42ca-92ff-901de8a4fb85-must-gather-output\") pod \"must-gather-7mgpg\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.852060 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnpjh\" (UniqueName: \"kubernetes.io/projected/973a8efa-1884-42ca-92ff-901de8a4fb85-kube-api-access-pnpjh\") pod \"must-gather-7mgpg\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.953650 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/973a8efa-1884-42ca-92ff-901de8a4fb85-must-gather-output\") pod \"must-gather-7mgpg\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " 
pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.953711 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnpjh\" (UniqueName: \"kubernetes.io/projected/973a8efa-1884-42ca-92ff-901de8a4fb85-kube-api-access-pnpjh\") pod \"must-gather-7mgpg\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:35 crc kubenswrapper[4912]: I0318 14:27:35.956988 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/973a8efa-1884-42ca-92ff-901de8a4fb85-must-gather-output\") pod \"must-gather-7mgpg\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:36 crc kubenswrapper[4912]: I0318 14:27:36.135575 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnpjh\" (UniqueName: \"kubernetes.io/projected/973a8efa-1884-42ca-92ff-901de8a4fb85-kube-api-access-pnpjh\") pod \"must-gather-7mgpg\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:36 crc kubenswrapper[4912]: I0318 14:27:36.386968 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:27:37 crc kubenswrapper[4912]: I0318 14:27:36.999404 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:27:37 crc kubenswrapper[4912]: I0318 14:27:37.000207 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:27:37 crc kubenswrapper[4912]: I0318 14:27:37.908101 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnfzh/must-gather-7mgpg"] Mar 18 14:27:37 crc kubenswrapper[4912]: W0318 14:27:37.926300 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod973a8efa_1884_42ca_92ff_901de8a4fb85.slice/crio-edf446b850e7427173b1a84c96a931caa6e40c32b0b042787120e94a9fe910ae WatchSource:0}: Error finding container edf446b850e7427173b1a84c96a931caa6e40c32b0b042787120e94a9fe910ae: Status 404 returned error can't find the container with id edf446b850e7427173b1a84c96a931caa6e40c32b0b042787120e94a9fe910ae Mar 18 14:27:38 crc kubenswrapper[4912]: I0318 14:27:38.484192 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" event={"ID":"973a8efa-1884-42ca-92ff-901de8a4fb85","Type":"ContainerStarted","Data":"edf446b850e7427173b1a84c96a931caa6e40c32b0b042787120e94a9fe910ae"} Mar 18 14:27:40 crc kubenswrapper[4912]: I0318 14:27:40.372226 4912 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-2tbxp" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" probeResult="failure" output=< Mar 18 14:27:40 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:27:40 crc kubenswrapper[4912]: > Mar 18 14:27:48 crc kubenswrapper[4912]: I0318 14:27:48.690068 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" event={"ID":"973a8efa-1884-42ca-92ff-901de8a4fb85","Type":"ContainerStarted","Data":"b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5"} Mar 18 14:27:49 crc kubenswrapper[4912]: I0318 14:27:49.710432 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" event={"ID":"973a8efa-1884-42ca-92ff-901de8a4fb85","Type":"ContainerStarted","Data":"033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf"} Mar 18 14:27:49 crc kubenswrapper[4912]: I0318 14:27:49.736750 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" podStartSLOduration=5.052503771 podStartE2EDuration="14.735166389s" podCreationTimestamp="2026-03-18 14:27:35 +0000 UTC" firstStartedPulling="2026-03-18 14:27:37.935602051 +0000 UTC m=+5106.395029476" lastFinishedPulling="2026-03-18 14:27:47.618264629 +0000 UTC m=+5116.077692094" observedRunningTime="2026-03-18 14:27:49.728823088 +0000 UTC m=+5118.188250513" watchObservedRunningTime="2026-03-18 14:27:49.735166389 +0000 UTC m=+5118.194593844" Mar 18 14:27:50 crc kubenswrapper[4912]: I0318 14:27:50.372149 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2tbxp" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" probeResult="failure" output=< Mar 18 14:27:50 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:27:50 crc kubenswrapper[4912]: > Mar 18 
14:27:56 crc kubenswrapper[4912]: I0318 14:27:56.894377 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-lzcpc"] Mar 18 14:27:56 crc kubenswrapper[4912]: I0318 14:27:56.901477 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.033247 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65403bef-ab30-4b6c-b9e8-8ca34882eebe-host\") pod \"crc-debug-lzcpc\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.033408 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5q8k\" (UniqueName: \"kubernetes.io/projected/65403bef-ab30-4b6c-b9e8-8ca34882eebe-kube-api-access-m5q8k\") pod \"crc-debug-lzcpc\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.136019 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65403bef-ab30-4b6c-b9e8-8ca34882eebe-host\") pod \"crc-debug-lzcpc\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.136155 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5q8k\" (UniqueName: \"kubernetes.io/projected/65403bef-ab30-4b6c-b9e8-8ca34882eebe-kube-api-access-m5q8k\") pod \"crc-debug-lzcpc\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.140415 4912 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65403bef-ab30-4b6c-b9e8-8ca34882eebe-host\") pod \"crc-debug-lzcpc\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.178699 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5q8k\" (UniqueName: \"kubernetes.io/projected/65403bef-ab30-4b6c-b9e8-8ca34882eebe-kube-api-access-m5q8k\") pod \"crc-debug-lzcpc\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.226268 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:27:57 crc kubenswrapper[4912]: I0318 14:27:57.827073 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" event={"ID":"65403bef-ab30-4b6c-b9e8-8ca34882eebe","Type":"ContainerStarted","Data":"acd1929075c3b8450d4fe388732c7e7800361425bf74f881e4c395823b95038a"} Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.199160 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564068-2vlzq"] Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.201797 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.205390 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.205389 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.208697 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.215282 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-2vlzq"] Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.274643 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248pb\" (UniqueName: \"kubernetes.io/projected/81eb1334-0249-47b0-a348-570af03963fd-kube-api-access-248pb\") pod \"auto-csr-approver-29564068-2vlzq\" (UID: \"81eb1334-0249-47b0-a348-570af03963fd\") " pod="openshift-infra/auto-csr-approver-29564068-2vlzq" Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.376192 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2tbxp" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" probeResult="failure" output=< Mar 18 14:28:00 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:28:00 crc kubenswrapper[4912]: > Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.378128 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-248pb\" (UniqueName: \"kubernetes.io/projected/81eb1334-0249-47b0-a348-570af03963fd-kube-api-access-248pb\") pod \"auto-csr-approver-29564068-2vlzq\" (UID: \"81eb1334-0249-47b0-a348-570af03963fd\") " 
pod="openshift-infra/auto-csr-approver-29564068-2vlzq" Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.403882 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248pb\" (UniqueName: \"kubernetes.io/projected/81eb1334-0249-47b0-a348-570af03963fd-kube-api-access-248pb\") pod \"auto-csr-approver-29564068-2vlzq\" (UID: \"81eb1334-0249-47b0-a348-570af03963fd\") " pod="openshift-infra/auto-csr-approver-29564068-2vlzq" Mar 18 14:28:00 crc kubenswrapper[4912]: I0318 14:28:00.581683 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" Mar 18 14:28:01 crc kubenswrapper[4912]: I0318 14:28:01.285862 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-2vlzq"] Mar 18 14:28:01 crc kubenswrapper[4912]: W0318 14:28:01.335592 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81eb1334_0249_47b0_a348_570af03963fd.slice/crio-c813e364b05c205576f8aa9aa6122e2caef598584abaa2389c4f714a1b7afc04 WatchSource:0}: Error finding container c813e364b05c205576f8aa9aa6122e2caef598584abaa2389c4f714a1b7afc04: Status 404 returned error can't find the container with id c813e364b05c205576f8aa9aa6122e2caef598584abaa2389c4f714a1b7afc04 Mar 18 14:28:01 crc kubenswrapper[4912]: I0318 14:28:01.930425 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" event={"ID":"81eb1334-0249-47b0-a348-570af03963fd","Type":"ContainerStarted","Data":"c813e364b05c205576f8aa9aa6122e2caef598584abaa2389c4f714a1b7afc04"} Mar 18 14:28:04 crc kubenswrapper[4912]: I0318 14:28:04.998598 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" 
event={"ID":"81eb1334-0249-47b0-a348-570af03963fd","Type":"ContainerStarted","Data":"2033d1c93650c3b613c32a247af0aefa05b6f191190136fc2b92906b83b2f9d9"} Mar 18 14:28:05 crc kubenswrapper[4912]: I0318 14:28:05.036710 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" podStartSLOduration=3.941254165 podStartE2EDuration="5.036677436s" podCreationTimestamp="2026-03-18 14:28:00 +0000 UTC" firstStartedPulling="2026-03-18 14:28:01.348564657 +0000 UTC m=+5129.807992082" lastFinishedPulling="2026-03-18 14:28:02.443987928 +0000 UTC m=+5130.903415353" observedRunningTime="2026-03-18 14:28:05.02091456 +0000 UTC m=+5133.480341985" watchObservedRunningTime="2026-03-18 14:28:05.036677436 +0000 UTC m=+5133.496104861" Mar 18 14:28:07 crc kubenswrapper[4912]: I0318 14:28:06.999926 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:28:07 crc kubenswrapper[4912]: I0318 14:28:07.000713 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:28:07 crc kubenswrapper[4912]: I0318 14:28:07.029836 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" event={"ID":"81eb1334-0249-47b0-a348-570af03963fd","Type":"ContainerDied","Data":"2033d1c93650c3b613c32a247af0aefa05b6f191190136fc2b92906b83b2f9d9"} Mar 18 14:28:07 crc kubenswrapper[4912]: I0318 14:28:07.031138 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="81eb1334-0249-47b0-a348-570af03963fd" containerID="2033d1c93650c3b613c32a247af0aefa05b6f191190136fc2b92906b83b2f9d9" exitCode=0 Mar 18 14:28:07 crc kubenswrapper[4912]: I0318 14:28:07.753289 4912 scope.go:117] "RemoveContainer" containerID="231f4e711a059e0ade1102d29e6c83307ef7a8431790799d2544bccc92c89808" Mar 18 14:28:09 crc kubenswrapper[4912]: I0318 14:28:09.379154 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:28:09 crc kubenswrapper[4912]: I0318 14:28:09.487128 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:28:09 crc kubenswrapper[4912]: I0318 14:28:09.650562 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2tbxp"] Mar 18 14:28:11 crc kubenswrapper[4912]: I0318 14:28:11.105448 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2tbxp" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" containerID="cri-o://590c33b801802fd5e13d72e4806f53a54ed39d99aa1fa5f8070859bbb724517a" gracePeriod=2 Mar 18 14:28:12 crc kubenswrapper[4912]: I0318 14:28:12.128416 4912 generic.go:334] "Generic (PLEG): container finished" podID="ee256d64-8bad-4105-8525-88fca8e28757" containerID="590c33b801802fd5e13d72e4806f53a54ed39d99aa1fa5f8070859bbb724517a" exitCode=0 Mar 18 14:28:12 crc kubenswrapper[4912]: I0318 14:28:12.128449 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tbxp" event={"ID":"ee256d64-8bad-4105-8525-88fca8e28757","Type":"ContainerDied","Data":"590c33b801802fd5e13d72e4806f53a54ed39d99aa1fa5f8070859bbb724517a"} Mar 18 14:28:16 crc kubenswrapper[4912]: E0318 14:28:16.175209 4912 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 18 14:28:16 crc kubenswrapper[4912]: E0318 14:28:16.178588 4912 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m5q8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-lzcpc_openshift-must-gather-nnfzh(65403bef-ab30-4b6c-b9e8-8ca34882eebe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:28:16 crc kubenswrapper[4912]: E0318 14:28:16.180236 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" podUID="65403bef-ab30-4b6c-b9e8-8ca34882eebe" Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.215962 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" 
event={"ID":"81eb1334-0249-47b0-a348-570af03963fd","Type":"ContainerDied","Data":"c813e364b05c205576f8aa9aa6122e2caef598584abaa2389c4f714a1b7afc04"} Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.221069 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c813e364b05c205576f8aa9aa6122e2caef598584abaa2389c4f714a1b7afc04" Mar 18 14:28:16 crc kubenswrapper[4912]: E0318 14:28:16.259228 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" podUID="65403bef-ab30-4b6c-b9e8-8ca34882eebe" Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.377297 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.456601 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-248pb\" (UniqueName: \"kubernetes.io/projected/81eb1334-0249-47b0-a348-570af03963fd-kube-api-access-248pb\") pod \"81eb1334-0249-47b0-a348-570af03963fd\" (UID: \"81eb1334-0249-47b0-a348-570af03963fd\") " Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.475286 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81eb1334-0249-47b0-a348-570af03963fd-kube-api-access-248pb" (OuterVolumeSpecName: "kube-api-access-248pb") pod "81eb1334-0249-47b0-a348-570af03963fd" (UID: "81eb1334-0249-47b0-a348-570af03963fd"). InnerVolumeSpecName "kube-api-access-248pb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.559891 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-248pb\" (UniqueName: \"kubernetes.io/projected/81eb1334-0249-47b0-a348-570af03963fd-kube-api-access-248pb\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.873074 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.971229 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-catalog-content\") pod \"ee256d64-8bad-4105-8525-88fca8e28757\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.971298 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-utilities\") pod \"ee256d64-8bad-4105-8525-88fca8e28757\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.971356 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k8h6\" (UniqueName: \"kubernetes.io/projected/ee256d64-8bad-4105-8525-88fca8e28757-kube-api-access-7k8h6\") pod \"ee256d64-8bad-4105-8525-88fca8e28757\" (UID: \"ee256d64-8bad-4105-8525-88fca8e28757\") " Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.973603 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-utilities" (OuterVolumeSpecName: "utilities") pod "ee256d64-8bad-4105-8525-88fca8e28757" (UID: "ee256d64-8bad-4105-8525-88fca8e28757"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:28:16 crc kubenswrapper[4912]: I0318 14:28:16.991132 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee256d64-8bad-4105-8525-88fca8e28757-kube-api-access-7k8h6" (OuterVolumeSpecName: "kube-api-access-7k8h6") pod "ee256d64-8bad-4105-8525-88fca8e28757" (UID: "ee256d64-8bad-4105-8525-88fca8e28757"). InnerVolumeSpecName "kube-api-access-7k8h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.074572 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.074612 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k8h6\" (UniqueName: \"kubernetes.io/projected/ee256d64-8bad-4105-8525-88fca8e28757-kube-api-access-7k8h6\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.198218 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee256d64-8bad-4105-8525-88fca8e28757" (UID: "ee256d64-8bad-4105-8525-88fca8e28757"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.232140 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-2vlzq" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.232178 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tbxp" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.232143 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tbxp" event={"ID":"ee256d64-8bad-4105-8525-88fca8e28757","Type":"ContainerDied","Data":"d0bec1fa32a51d9e8cd5bd4d92f5d12afdc8103e0ff5cd4aa96e342966c094a8"} Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.232277 4912 scope.go:117] "RemoveContainer" containerID="590c33b801802fd5e13d72e4806f53a54ed39d99aa1fa5f8070859bbb724517a" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.271919 4912 scope.go:117] "RemoveContainer" containerID="bcfbb6d2cb3e3caad351a1131b02c5e5b64f45536d4b390d5cd5d930802b3106" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.280850 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee256d64-8bad-4105-8525-88fca8e28757-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.297655 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2tbxp"] Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.315268 4912 scope.go:117] "RemoveContainer" containerID="517ddabc9010b1457053fd0adee77afea24872b67feb00ed0ffd7cd128c4017e" Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.321670 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2tbxp"] Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.503577 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-vrtbg"] Mar 18 14:28:17 crc kubenswrapper[4912]: I0318 14:28:17.523452 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-vrtbg"] Mar 18 14:28:18 crc kubenswrapper[4912]: I0318 14:28:18.242526 4912 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="56a2eef2-104c-46be-ae21-de8f5817f558" path="/var/lib/kubelet/pods/56a2eef2-104c-46be-ae21-de8f5817f558/volumes" Mar 18 14:28:18 crc kubenswrapper[4912]: I0318 14:28:18.256881 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee256d64-8bad-4105-8525-88fca8e28757" path="/var/lib/kubelet/pods/ee256d64-8bad-4105-8525-88fca8e28757/volumes" Mar 18 14:28:30 crc kubenswrapper[4912]: I0318 14:28:30.235442 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:28:31 crc kubenswrapper[4912]: I0318 14:28:31.818900 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" event={"ID":"65403bef-ab30-4b6c-b9e8-8ca34882eebe","Type":"ContainerStarted","Data":"621ae32aaef30236fd469b758aba279af9bdd5cb4963a9cabc8231b9d9623150"} Mar 18 14:28:31 crc kubenswrapper[4912]: I0318 14:28:31.838869 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" podStartSLOduration=2.385061272 podStartE2EDuration="35.838847843s" podCreationTimestamp="2026-03-18 14:27:56 +0000 UTC" firstStartedPulling="2026-03-18 14:27:57.290306089 +0000 UTC m=+5125.749733514" lastFinishedPulling="2026-03-18 14:28:30.74409266 +0000 UTC m=+5159.203520085" observedRunningTime="2026-03-18 14:28:31.835807461 +0000 UTC m=+5160.295234916" watchObservedRunningTime="2026-03-18 14:28:31.838847843 +0000 UTC m=+5160.298275268" Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:36.999369 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:37.000311 4912 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:37.000406 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:37.002894 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5ced18fd2e5d09788e975802c82a54bc9a779ddaeae6d48f618506d7040a53a"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:37.003338 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://c5ced18fd2e5d09788e975802c82a54bc9a779ddaeae6d48f618506d7040a53a" gracePeriod=600 Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:37.923276 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="c5ced18fd2e5d09788e975802c82a54bc9a779ddaeae6d48f618506d7040a53a" exitCode=0 Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:37.923345 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"c5ced18fd2e5d09788e975802c82a54bc9a779ddaeae6d48f618506d7040a53a"} Mar 18 14:28:37 crc kubenswrapper[4912]: I0318 14:28:37.923391 4912 scope.go:117] "RemoveContainer" 
containerID="1b2112c9cf29610b0092143a83ffeccf521a0186447a9b18c72054961169a55a" Mar 18 14:28:39 crc kubenswrapper[4912]: I0318 14:28:39.955651 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589"} Mar 18 14:29:01 crc kubenswrapper[4912]: I0318 14:29:01.246225 4912 generic.go:334] "Generic (PLEG): container finished" podID="fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd" containerID="a8371e44cd45687f253f6ae01e1fd665429fe9f448934a897b3b4aed3a2a1268" exitCode=0 Mar 18 14:29:01 crc kubenswrapper[4912]: I0318 14:29:01.247011 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" event={"ID":"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd","Type":"ContainerDied","Data":"a8371e44cd45687f253f6ae01e1fd665429fe9f448934a897b3b4aed3a2a1268"} Mar 18 14:29:01 crc kubenswrapper[4912]: I0318 14:29:01.248680 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" event={"ID":"fd168f8f-d6fd-4731-9a0a-bd9ab50ef9cd","Type":"ContainerStarted","Data":"c4f188d75f3bb006af626ce9c0ad701cdfc291dd272f6d89f1f73bfeff43fb24"} Mar 18 14:29:08 crc kubenswrapper[4912]: I0318 14:29:08.603640 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 14:29:08 crc kubenswrapper[4912]: I0318 14:29:08.605785 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 14:29:16 crc kubenswrapper[4912]: I0318 14:29:16.417690 4912 scope.go:117] "RemoveContainer" containerID="6c0b5a8916f0a127d2dc34d512d26d5d805a12d95c18bb74bef03414ef89cd37" Mar 18 14:29:19 crc kubenswrapper[4912]: I0318 14:29:19.505898 4912 generic.go:334] "Generic (PLEG): 
container finished" podID="65403bef-ab30-4b6c-b9e8-8ca34882eebe" containerID="621ae32aaef30236fd469b758aba279af9bdd5cb4963a9cabc8231b9d9623150" exitCode=0 Mar 18 14:29:19 crc kubenswrapper[4912]: I0318 14:29:19.506562 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" event={"ID":"65403bef-ab30-4b6c-b9e8-8ca34882eebe","Type":"ContainerDied","Data":"621ae32aaef30236fd469b758aba279af9bdd5cb4963a9cabc8231b9d9623150"} Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.647805 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.688406 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-lzcpc"] Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.700469 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-lzcpc"] Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.824785 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65403bef-ab30-4b6c-b9e8-8ca34882eebe-host\") pod \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.824955 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65403bef-ab30-4b6c-b9e8-8ca34882eebe-host" (OuterVolumeSpecName: "host") pod "65403bef-ab30-4b6c-b9e8-8ca34882eebe" (UID: "65403bef-ab30-4b6c-b9e8-8ca34882eebe"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.825377 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5q8k\" (UniqueName: \"kubernetes.io/projected/65403bef-ab30-4b6c-b9e8-8ca34882eebe-kube-api-access-m5q8k\") pod \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\" (UID: \"65403bef-ab30-4b6c-b9e8-8ca34882eebe\") " Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.826364 4912 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65403bef-ab30-4b6c-b9e8-8ca34882eebe-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.845634 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65403bef-ab30-4b6c-b9e8-8ca34882eebe-kube-api-access-m5q8k" (OuterVolumeSpecName: "kube-api-access-m5q8k") pod "65403bef-ab30-4b6c-b9e8-8ca34882eebe" (UID: "65403bef-ab30-4b6c-b9e8-8ca34882eebe"). InnerVolumeSpecName "kube-api-access-m5q8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:29:20 crc kubenswrapper[4912]: I0318 14:29:20.928403 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5q8k\" (UniqueName: \"kubernetes.io/projected/65403bef-ab30-4b6c-b9e8-8ca34882eebe-kube-api-access-m5q8k\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:21 crc kubenswrapper[4912]: I0318 14:29:21.532421 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd1929075c3b8450d4fe388732c7e7800361425bf74f881e4c395823b95038a" Mar 18 14:29:21 crc kubenswrapper[4912]: I0318 14:29:21.532508 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-lzcpc" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.077347 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-xstts"] Mar 18 14:29:22 crc kubenswrapper[4912]: E0318 14:29:22.078395 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="extract-content" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078412 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="extract-content" Mar 18 14:29:22 crc kubenswrapper[4912]: E0318 14:29:22.078436 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81eb1334-0249-47b0-a348-570af03963fd" containerName="oc" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078442 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="81eb1334-0249-47b0-a348-570af03963fd" containerName="oc" Mar 18 14:29:22 crc kubenswrapper[4912]: E0318 14:29:22.078467 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078473 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" Mar 18 14:29:22 crc kubenswrapper[4912]: E0318 14:29:22.078491 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="extract-utilities" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078497 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="extract-utilities" Mar 18 14:29:22 crc kubenswrapper[4912]: E0318 14:29:22.078513 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65403bef-ab30-4b6c-b9e8-8ca34882eebe" 
containerName="container-00" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078519 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="65403bef-ab30-4b6c-b9e8-8ca34882eebe" containerName="container-00" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078748 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="81eb1334-0249-47b0-a348-570af03963fd" containerName="oc" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078769 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="65403bef-ab30-4b6c-b9e8-8ca34882eebe" containerName="container-00" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.078790 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee256d64-8bad-4105-8525-88fca8e28757" containerName="registry-server" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.079718 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.160727 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-host\") pod \"crc-debug-xstts\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.160903 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgzdv\" (UniqueName: \"kubernetes.io/projected/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-kube-api-access-sgzdv\") pod \"crc-debug-xstts\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.246969 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65403bef-ab30-4b6c-b9e8-8ca34882eebe" 
path="/var/lib/kubelet/pods/65403bef-ab30-4b6c-b9e8-8ca34882eebe/volumes" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.263026 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-host\") pod \"crc-debug-xstts\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.263154 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-host\") pod \"crc-debug-xstts\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.263161 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgzdv\" (UniqueName: \"kubernetes.io/projected/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-kube-api-access-sgzdv\") pod \"crc-debug-xstts\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.286247 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgzdv\" (UniqueName: \"kubernetes.io/projected/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-kube-api-access-sgzdv\") pod \"crc-debug-xstts\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.401013 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:22 crc kubenswrapper[4912]: I0318 14:29:22.550807 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/crc-debug-xstts" event={"ID":"4273c379-6d8c-47b1-9b37-7b3bc841cfcf","Type":"ContainerStarted","Data":"d6e13f1816d39967afdd8566cdd3cd8d7438075c3ec07a8874375523bcad755d"} Mar 18 14:29:23 crc kubenswrapper[4912]: I0318 14:29:23.565060 4912 generic.go:334] "Generic (PLEG): container finished" podID="4273c379-6d8c-47b1-9b37-7b3bc841cfcf" containerID="4ca705a1debba8c103772ef1f6f76ca88a83286964454a2e5d9d0911bf21cf57" exitCode=0 Mar 18 14:29:23 crc kubenswrapper[4912]: I0318 14:29:23.565508 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/crc-debug-xstts" event={"ID":"4273c379-6d8c-47b1-9b37-7b3bc841cfcf","Type":"ContainerDied","Data":"4ca705a1debba8c103772ef1f6f76ca88a83286964454a2e5d9d0911bf21cf57"} Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.704602 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-xstts"] Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.708265 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.722435 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-xstts"] Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.838514 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgzdv\" (UniqueName: \"kubernetes.io/projected/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-kube-api-access-sgzdv\") pod \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.838677 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-host\") pod \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\" (UID: \"4273c379-6d8c-47b1-9b37-7b3bc841cfcf\") " Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.838846 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-host" (OuterVolumeSpecName: "host") pod "4273c379-6d8c-47b1-9b37-7b3bc841cfcf" (UID: "4273c379-6d8c-47b1-9b37-7b3bc841cfcf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.839368 4912 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.849665 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-kube-api-access-sgzdv" (OuterVolumeSpecName: "kube-api-access-sgzdv") pod "4273c379-6d8c-47b1-9b37-7b3bc841cfcf" (UID: "4273c379-6d8c-47b1-9b37-7b3bc841cfcf"). 
InnerVolumeSpecName "kube-api-access-sgzdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:29:24 crc kubenswrapper[4912]: I0318 14:29:24.942460 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgzdv\" (UniqueName: \"kubernetes.io/projected/4273c379-6d8c-47b1-9b37-7b3bc841cfcf-kube-api-access-sgzdv\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.592063 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e13f1816d39967afdd8566cdd3cd8d7438075c3ec07a8874375523bcad755d" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.592100 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-xstts" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.879733 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-cd69s"] Mar 18 14:29:25 crc kubenswrapper[4912]: E0318 14:29:25.881597 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4273c379-6d8c-47b1-9b37-7b3bc841cfcf" containerName="container-00" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.881622 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4273c379-6d8c-47b1-9b37-7b3bc841cfcf" containerName="container-00" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.881858 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="4273c379-6d8c-47b1-9b37-7b3bc841cfcf" containerName="container-00" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.882776 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.970238 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9aef372-e9d0-4a82-8ea5-53626d9564b0-host\") pod \"crc-debug-cd69s\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:25 crc kubenswrapper[4912]: I0318 14:29:25.970339 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jplwc\" (UniqueName: \"kubernetes.io/projected/b9aef372-e9d0-4a82-8ea5-53626d9564b0-kube-api-access-jplwc\") pod \"crc-debug-cd69s\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.076962 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9aef372-e9d0-4a82-8ea5-53626d9564b0-host\") pod \"crc-debug-cd69s\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.077167 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jplwc\" (UniqueName: \"kubernetes.io/projected/b9aef372-e9d0-4a82-8ea5-53626d9564b0-kube-api-access-jplwc\") pod \"crc-debug-cd69s\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.077514 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9aef372-e9d0-4a82-8ea5-53626d9564b0-host\") pod \"crc-debug-cd69s\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:26 crc 
kubenswrapper[4912]: I0318 14:29:26.101122 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jplwc\" (UniqueName: \"kubernetes.io/projected/b9aef372-e9d0-4a82-8ea5-53626d9564b0-kube-api-access-jplwc\") pod \"crc-debug-cd69s\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.201795 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.285833 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4273c379-6d8c-47b1-9b37-7b3bc841cfcf" path="/var/lib/kubelet/pods/4273c379-6d8c-47b1-9b37-7b3bc841cfcf/volumes" Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.628338 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/crc-debug-cd69s" event={"ID":"b9aef372-e9d0-4a82-8ea5-53626d9564b0","Type":"ContainerStarted","Data":"160ef8d530f8d722fd19a0088c92267ed499d40c26acc2116060445ebba2bf9f"} Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.628390 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/crc-debug-cd69s" event={"ID":"b9aef372-e9d0-4a82-8ea5-53626d9564b0","Type":"ContainerStarted","Data":"c1684226bb3d72092feda2d280149ba0b1798ee39b9c8a5126abb593bf9af35a"} Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.703067 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-cd69s"] Mar 18 14:29:26 crc kubenswrapper[4912]: I0318 14:29:26.715122 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfzh/crc-debug-cd69s"] Mar 18 14:29:27 crc kubenswrapper[4912]: I0318 14:29:27.644615 4912 generic.go:334] "Generic (PLEG): container finished" podID="b9aef372-e9d0-4a82-8ea5-53626d9564b0" 
containerID="160ef8d530f8d722fd19a0088c92267ed499d40c26acc2116060445ebba2bf9f" exitCode=0 Mar 18 14:29:27 crc kubenswrapper[4912]: I0318 14:29:27.799476 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:27 crc kubenswrapper[4912]: I0318 14:29:27.963153 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jplwc\" (UniqueName: \"kubernetes.io/projected/b9aef372-e9d0-4a82-8ea5-53626d9564b0-kube-api-access-jplwc\") pod \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " Mar 18 14:29:27 crc kubenswrapper[4912]: I0318 14:29:27.963243 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9aef372-e9d0-4a82-8ea5-53626d9564b0-host\") pod \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\" (UID: \"b9aef372-e9d0-4a82-8ea5-53626d9564b0\") " Mar 18 14:29:27 crc kubenswrapper[4912]: I0318 14:29:27.963375 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9aef372-e9d0-4a82-8ea5-53626d9564b0-host" (OuterVolumeSpecName: "host") pod "b9aef372-e9d0-4a82-8ea5-53626d9564b0" (UID: "b9aef372-e9d0-4a82-8ea5-53626d9564b0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:29:27 crc kubenswrapper[4912]: I0318 14:29:27.963975 4912 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9aef372-e9d0-4a82-8ea5-53626d9564b0-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:27 crc kubenswrapper[4912]: I0318 14:29:27.977280 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9aef372-e9d0-4a82-8ea5-53626d9564b0-kube-api-access-jplwc" (OuterVolumeSpecName: "kube-api-access-jplwc") pod "b9aef372-e9d0-4a82-8ea5-53626d9564b0" (UID: "b9aef372-e9d0-4a82-8ea5-53626d9564b0"). InnerVolumeSpecName "kube-api-access-jplwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:29:28 crc kubenswrapper[4912]: I0318 14:29:28.066916 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jplwc\" (UniqueName: \"kubernetes.io/projected/b9aef372-e9d0-4a82-8ea5-53626d9564b0-kube-api-access-jplwc\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:28 crc kubenswrapper[4912]: I0318 14:29:28.244120 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9aef372-e9d0-4a82-8ea5-53626d9564b0" path="/var/lib/kubelet/pods/b9aef372-e9d0-4a82-8ea5-53626d9564b0/volumes" Mar 18 14:29:28 crc kubenswrapper[4912]: I0318 14:29:28.609925 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 14:29:28 crc kubenswrapper[4912]: I0318 14:29:28.614622 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-cdbfdff57-fmktk" Mar 18 14:29:28 crc kubenswrapper[4912]: I0318 14:29:28.660219 4912 scope.go:117] "RemoveContainer" containerID="160ef8d530f8d722fd19a0088c92267ed499d40c26acc2116060445ebba2bf9f" Mar 18 14:29:28 crc kubenswrapper[4912]: I0318 14:29:28.660252 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/crc-debug-cd69s" Mar 18 14:29:58 crc kubenswrapper[4912]: I0318 14:29:58.609298 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_122ef44f-951b-4aa0-bef1-d190f7b5a495/aodh-api/0.log" Mar 18 14:29:58 crc kubenswrapper[4912]: I0318 14:29:58.790506 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_122ef44f-951b-4aa0-bef1-d190f7b5a495/aodh-evaluator/0.log" Mar 18 14:29:58 crc kubenswrapper[4912]: I0318 14:29:58.883534 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_122ef44f-951b-4aa0-bef1-d190f7b5a495/aodh-notifier/0.log" Mar 18 14:29:58 crc kubenswrapper[4912]: I0318 14:29:58.900196 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_122ef44f-951b-4aa0-bef1-d190f7b5a495/aodh-listener/0.log" Mar 18 14:29:59 crc kubenswrapper[4912]: I0318 14:29:59.107016 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-59f85f449d-2mslp_c5262960-4228-43dd-a5d9-0fcdfe8111c3/barbican-api/0.log" Mar 18 14:29:59 crc kubenswrapper[4912]: I0318 14:29:59.172526 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-59f85f449d-2mslp_c5262960-4228-43dd-a5d9-0fcdfe8111c3/barbican-api-log/0.log" Mar 18 14:29:59 crc kubenswrapper[4912]: I0318 14:29:59.315681 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7965f976fd-25ct8_94867b1a-6891-4d44-b968-0b18a8b30085/barbican-keystone-listener/0.log" Mar 18 14:29:59 crc kubenswrapper[4912]: I0318 14:29:59.518958 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7965f976fd-25ct8_94867b1a-6891-4d44-b968-0b18a8b30085/barbican-keystone-listener-log/0.log" Mar 18 14:29:59 crc kubenswrapper[4912]: I0318 14:29:59.588245 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-79f58cdfc5-fz7qx_85dfe685-650b-44a6-b164-137cae893166/barbican-worker-log/0.log" Mar 18 14:29:59 crc kubenswrapper[4912]: I0318 14:29:59.629824 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-79f58cdfc5-fz7qx_85dfe685-650b-44a6-b164-137cae893166/barbican-worker/0.log" Mar 18 14:29:59 crc kubenswrapper[4912]: I0318 14:29:59.938550 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-x8s49_c04db868-dfd5-464a-97c3-437a011e243a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.019194 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b/ceilometer-central-agent/1.log" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.152167 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564070-qvgzj"] Mar 18 14:30:00 crc kubenswrapper[4912]: E0318 14:30:00.152979 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9aef372-e9d0-4a82-8ea5-53626d9564b0" containerName="container-00" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.153094 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9aef372-e9d0-4a82-8ea5-53626d9564b0" containerName="container-00" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.153439 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9aef372-e9d0-4a82-8ea5-53626d9564b0" containerName="container-00" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.154600 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.161656 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.161854 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.162136 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.170742 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-qvgzj"] Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.188560 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98"] Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.192915 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv29v\" (UniqueName: \"kubernetes.io/projected/49c1e3c5-c136-42cb-a0f6-4067a05a67c4-kube-api-access-qv29v\") pod \"auto-csr-approver-29564070-qvgzj\" (UID: \"49c1e3c5-c136-42cb-a0f6-4067a05a67c4\") " pod="openshift-infra/auto-csr-approver-29564070-qvgzj" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.194838 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.199495 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.200193 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.266244 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98"] Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.296789 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b/ceilometer-notification-agent/0.log" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.296896 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv29v\" (UniqueName: \"kubernetes.io/projected/49c1e3c5-c136-42cb-a0f6-4067a05a67c4-kube-api-access-qv29v\") pod \"auto-csr-approver-29564070-qvgzj\" (UID: \"49c1e3c5-c136-42cb-a0f6-4067a05a67c4\") " pod="openshift-infra/auto-csr-approver-29564070-qvgzj" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.340929 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv29v\" (UniqueName: \"kubernetes.io/projected/49c1e3c5-c136-42cb-a0f6-4067a05a67c4-kube-api-access-qv29v\") pod \"auto-csr-approver-29564070-qvgzj\" (UID: \"49c1e3c5-c136-42cb-a0f6-4067a05a67c4\") " pod="openshift-infra/auto-csr-approver-29564070-qvgzj" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.347415 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b/ceilometer-central-agent/0.log" Mar 18 14:30:00 crc 
kubenswrapper[4912]: I0318 14:30:00.359302 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b/proxy-httpd/0.log" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.375488 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eb35fc5a-3fa3-4bd4-91d8-f5a004576e3b/sg-core/0.log" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.399589 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hls2\" (UniqueName: \"kubernetes.io/projected/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-kube-api-access-7hls2\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.399900 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-secret-volume\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.400078 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-config-volume\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.481461 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.503584 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hls2\" (UniqueName: \"kubernetes.io/projected/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-kube-api-access-7hls2\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.503731 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-secret-volume\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.503815 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-config-volume\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.504941 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-config-volume\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.512700 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-secret-volume\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.525565 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hls2\" (UniqueName: \"kubernetes.io/projected/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-kube-api-access-7hls2\") pod \"collect-profiles-29564070-pbs98\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.548785 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.673667 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f74a682c-ca05-498c-ab11-4ccf3d7d3b46/cinder-api/0.log" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.722999 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f74a682c-ca05-498c-ab11-4ccf3d7d3b46/cinder-api-log/0.log" Mar 18 14:30:00 crc kubenswrapper[4912]: I0318 14:30:00.975643 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc/cinder-scheduler/0.log" Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.052543 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc/cinder-scheduler/1.log" Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.157256 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-qvgzj"] Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.175426 4912 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ab3ea7b1-2b5f-4850-92b1-4f6585fc1bbc/probe/0.log" Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.335382 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98"] Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.344303 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-rgj4t_da62fce5-ea10-4763-b58f-81932668abee/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.523922 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m5q8h_e89a9418-20a0-4b00-9012-5c17c43b7170/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.662626 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-8qfhh_0db2a85b-f256-4ef8-b380-5831240903c7/init/0.log" Mar 18 14:30:01 crc kubenswrapper[4912]: I0318 14:30:01.947066 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-8qfhh_0db2a85b-f256-4ef8-b380-5831240903c7/init/0.log" Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.002283 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mjb6w_c70d046b-5ae9-4514-ac8b-7904ab66c16b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.015916 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-8qfhh_0db2a85b-f256-4ef8-b380-5831240903c7/dnsmasq-dns/0.log" Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.154453 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" event={"ID":"b531f4d4-ef53-4c22-bdd9-91fedb2f4971","Type":"ContainerStarted","Data":"192ec4e4a80996c02d14c5ecb4da97dc3c5cbd9c0256b7bc77881e7210f65aa3"} Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.154525 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" event={"ID":"b531f4d4-ef53-4c22-bdd9-91fedb2f4971","Type":"ContainerStarted","Data":"fa481fe1691f60e72374b33d976a9cc71053ab71d98a620a75da04cc8a93cb91"} Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.160510 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" event={"ID":"49c1e3c5-c136-42cb-a0f6-4067a05a67c4","Type":"ContainerStarted","Data":"f170a42afa8458b3293d468785bfabc524dab6cee8994b49dbe380ae24a9ca65"} Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.186358 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" podStartSLOduration=2.186327939 podStartE2EDuration="2.186327939s" podCreationTimestamp="2026-03-18 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:30:02.177823519 +0000 UTC m=+5250.637250954" watchObservedRunningTime="2026-03-18 14:30:02.186327939 +0000 UTC m=+5250.645755364" Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.686071 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8dc0c338-1e2c-43b7-9d84-96a42e7df1a5/glance-httpd/0.log" Mar 18 14:30:02 crc kubenswrapper[4912]: I0318 14:30:02.718119 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8dc0c338-1e2c-43b7-9d84-96a42e7df1a5/glance-log/0.log" Mar 18 14:30:03 crc kubenswrapper[4912]: I0318 
14:30:03.008471 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_84708bef-9104-4bd8-8437-b068dfcb9f65/glance-httpd/0.log" Mar 18 14:30:03 crc kubenswrapper[4912]: I0318 14:30:03.016429 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_84708bef-9104-4bd8-8437-b068dfcb9f65/glance-log/0.log" Mar 18 14:30:03 crc kubenswrapper[4912]: I0318 14:30:03.183784 4912 generic.go:334] "Generic (PLEG): container finished" podID="b531f4d4-ef53-4c22-bdd9-91fedb2f4971" containerID="192ec4e4a80996c02d14c5ecb4da97dc3c5cbd9c0256b7bc77881e7210f65aa3" exitCode=0 Mar 18 14:30:03 crc kubenswrapper[4912]: I0318 14:30:03.183874 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" event={"ID":"b531f4d4-ef53-4c22-bdd9-91fedb2f4971","Type":"ContainerDied","Data":"192ec4e4a80996c02d14c5ecb4da97dc3c5cbd9c0256b7bc77881e7210f65aa3"} Mar 18 14:30:03 crc kubenswrapper[4912]: I0318 14:30:03.818567 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8xpnq_429eaf59-f68a-4347-8e97-77c61e6213e3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:03 crc kubenswrapper[4912]: I0318 14:30:03.820490 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-84cf4c78d4-lld2l_4569488e-f272-42ff-a887-5eb7d399cece/heat-engine/0.log" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.125470 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-62zpp_57f0a91c-168e-4a04-a0fa-5d1ea81eea22/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.141983 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-api-6cb8998785-grlzs_47d6eba1-db73-4357-86a6-c15e562f20bd/heat-api/0.log" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.154155 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5d44c98778-8qzdk_1063826f-dbd1-4b72-b541-dd9832dd788c/heat-cfnapi/0.log" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.202837 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" event={"ID":"49c1e3c5-c136-42cb-a0f6-4067a05a67c4","Type":"ContainerStarted","Data":"11ee896c8d4d04e24db8c51e305d28fdea4e637a223d6df842532d86fa322ab7"} Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.235860 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" podStartSLOduration=2.14386339 podStartE2EDuration="4.235828524s" podCreationTimestamp="2026-03-18 14:30:00 +0000 UTC" firstStartedPulling="2026-03-18 14:30:01.196241789 +0000 UTC m=+5249.655669214" lastFinishedPulling="2026-03-18 14:30:03.288206923 +0000 UTC m=+5251.747634348" observedRunningTime="2026-03-18 14:30:04.221120176 +0000 UTC m=+5252.680547601" watchObservedRunningTime="2026-03-18 14:30:04.235828524 +0000 UTC m=+5252.695255969" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.571574 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4afb2214-0d5c-469e-8763-580ea6d84b7d/kube-state-metrics/0.log" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.573372 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29564041-fj5hw_05fe35d1-77ac-4349-b1a0-25cfd25bc5a8/keystone-cron/0.log" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.830164 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.932628 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hls2\" (UniqueName: \"kubernetes.io/projected/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-kube-api-access-7hls2\") pod \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.932884 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-secret-volume\") pod \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.933179 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-config-volume\") pod \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\" (UID: \"b531f4d4-ef53-4c22-bdd9-91fedb2f4971\") " Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.938342 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-config-volume" (OuterVolumeSpecName: "config-volume") pod "b531f4d4-ef53-4c22-bdd9-91fedb2f4971" (UID: "b531f4d4-ef53-4c22-bdd9-91fedb2f4971"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.991691 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-kube-api-access-7hls2" (OuterVolumeSpecName: "kube-api-access-7hls2") pod "b531f4d4-ef53-4c22-bdd9-91fedb2f4971" (UID: "b531f4d4-ef53-4c22-bdd9-91fedb2f4971"). 
InnerVolumeSpecName "kube-api-access-7hls2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:30:04 crc kubenswrapper[4912]: I0318 14:30:04.993779 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b531f4d4-ef53-4c22-bdd9-91fedb2f4971" (UID: "b531f4d4-ef53-4c22-bdd9-91fedb2f4971"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.006383 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wp8kj_04124f84-2385-4e87-b1c6-ac325ca92d7a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.037387 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hls2\" (UniqueName: \"kubernetes.io/projected/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-kube-api-access-7hls2\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.037428 4912 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.037440 4912 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b531f4d4-ef53-4c22-bdd9-91fedb2f4971-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.051862 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-bvtqd_555eb0bd-76ea-4584-b984-fcf3ee653fe9/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.085106 4912 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_keystone-79d7d9b7f7-jbmpn_5c102a6b-029b-47f9-bdd5-66fb03606564/keystone-api/0.log" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.219661 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" event={"ID":"b531f4d4-ef53-4c22-bdd9-91fedb2f4971","Type":"ContainerDied","Data":"fa481fe1691f60e72374b33d976a9cc71053ab71d98a620a75da04cc8a93cb91"} Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.219776 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa481fe1691f60e72374b33d976a9cc71053ab71d98a620a75da04cc8a93cb91" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.219712 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-pbs98" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.224502 4912 generic.go:334] "Generic (PLEG): container finished" podID="49c1e3c5-c136-42cb-a0f6-4067a05a67c4" containerID="11ee896c8d4d04e24db8c51e305d28fdea4e637a223d6df842532d86fa322ab7" exitCode=0 Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.224573 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" event={"ID":"49c1e3c5-c136-42cb-a0f6-4067a05a67c4","Type":"ContainerDied","Data":"11ee896c8d4d04e24db8c51e305d28fdea4e637a223d6df842532d86fa322ab7"} Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.293886 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_c50a92bb-6367-46ff-8f61-2cc2418f9f6e/mysqld-exporter/0.log" Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.938767 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2"] Mar 18 14:30:05 crc kubenswrapper[4912]: I0318 14:30:05.955659 4912 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-5sjt2"] Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.253553 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39e1ae8-1b4b-4562-aa08-04d7a1023654" path="/var/lib/kubelet/pods/c39e1ae8-1b4b-4562-aa08-04d7a1023654/volumes" Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.391914 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d57nh_0e4b3676-5cda-4170-86e4-ce503ec22aa0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.453031 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d86cc5c8f-fqc82_6f2b7b0b-4b03-441d-9c94-606e57f8e710/neutron-httpd/0.log" Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.605885 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d86cc5c8f-fqc82_6f2b7b0b-4b03-441d-9c94-606e57f8e710/neutron-api/0.log" Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.760722 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.787435 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv29v\" (UniqueName: \"kubernetes.io/projected/49c1e3c5-c136-42cb-a0f6-4067a05a67c4-kube-api-access-qv29v\") pod \"49c1e3c5-c136-42cb-a0f6-4067a05a67c4\" (UID: \"49c1e3c5-c136-42cb-a0f6-4067a05a67c4\") " Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.835395 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c1e3c5-c136-42cb-a0f6-4067a05a67c4-kube-api-access-qv29v" (OuterVolumeSpecName: "kube-api-access-qv29v") pod "49c1e3c5-c136-42cb-a0f6-4067a05a67c4" (UID: "49c1e3c5-c136-42cb-a0f6-4067a05a67c4"). InnerVolumeSpecName "kube-api-access-qv29v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:30:06 crc kubenswrapper[4912]: I0318 14:30:06.894356 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv29v\" (UniqueName: \"kubernetes.io/projected/49c1e3c5-c136-42cb-a0f6-4067a05a67c4-kube-api-access-qv29v\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.276618 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" event={"ID":"49c1e3c5-c136-42cb-a0f6-4067a05a67c4","Type":"ContainerDied","Data":"f170a42afa8458b3293d468785bfabc524dab6cee8994b49dbe380ae24a9ca65"} Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.276727 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f170a42afa8458b3293d468785bfabc524dab6cee8994b49dbe380ae24a9ca65" Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.276786 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-qvgzj" Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.312926 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-bn24n"] Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.331251 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-bn24n"] Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.568250 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b1e5f76f-6db6-442d-9e3d-9b8a1de16910/nova-api-log/0.log" Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.579990 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_be757545-6411-4e1a-bd46-6cddf5a22d61/nova-cell0-conductor-conductor/0.log" Mar 18 14:30:07 crc kubenswrapper[4912]: I0318 14:30:07.895183 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_78b0bbd2-0094-4cf8-b5ae-624dde267b12/nova-cell1-conductor-conductor/0.log" Mar 18 14:30:08 crc kubenswrapper[4912]: I0318 14:30:08.074692 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7358c044-d1cd-4087-b377-06d6bf36d82b/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 14:30:08 crc kubenswrapper[4912]: I0318 14:30:08.245673 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7381c303-a98e-4755-91f4-beab8d1ab273" path="/var/lib/kubelet/pods/7381c303-a98e-4755-91f4-beab8d1ab273/volumes" Mar 18 14:30:08 crc kubenswrapper[4912]: I0318 14:30:08.260949 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-7rh7k_3db96e35-5cad-42d1-afe8-bf48fa9ac92e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:08 crc kubenswrapper[4912]: I0318 14:30:08.397535 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_b1e5f76f-6db6-442d-9e3d-9b8a1de16910/nova-api-api/0.log" Mar 18 14:30:09 crc kubenswrapper[4912]: I0318 14:30:09.213802 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0/nova-metadata-log/0.log" Mar 18 14:30:09 crc kubenswrapper[4912]: I0318 14:30:09.581122 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77736799-2ebe-4076-9717-6741aed93599/mysql-bootstrap/0.log" Mar 18 14:30:09 crc kubenswrapper[4912]: I0318 14:30:09.681321 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_0f957cfd-c546-4a96-a235-ff2d1475ff7a/nova-scheduler-scheduler/0.log" Mar 18 14:30:09 crc kubenswrapper[4912]: I0318 14:30:09.758322 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77736799-2ebe-4076-9717-6741aed93599/mysql-bootstrap/0.log" Mar 18 14:30:09 crc kubenswrapper[4912]: I0318 14:30:09.857422 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77736799-2ebe-4076-9717-6741aed93599/galera/1.log" Mar 18 14:30:09 crc kubenswrapper[4912]: I0318 14:30:09.861836 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7f51ee38-7dc6-4a34-a6aa-941af6d9a7e0/nova-metadata-metadata/0.log" Mar 18 14:30:09 crc kubenswrapper[4912]: I0318 14:30:09.957194 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77736799-2ebe-4076-9717-6741aed93599/galera/0.log" Mar 18 14:30:10 crc kubenswrapper[4912]: I0318 14:30:10.121430 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d0973556-9c2c-4037-b800-d11ecf1904cc/mysql-bootstrap/0.log" Mar 18 14:30:10 crc kubenswrapper[4912]: I0318 14:30:10.460831 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_d0973556-9c2c-4037-b800-d11ecf1904cc/mysql-bootstrap/0.log" Mar 18 14:30:10 crc kubenswrapper[4912]: I0318 14:30:10.487908 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d0973556-9c2c-4037-b800-d11ecf1904cc/galera/0.log" Mar 18 14:30:10 crc kubenswrapper[4912]: I0318 14:30:10.557025 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d0973556-9c2c-4037-b800-d11ecf1904cc/galera/1.log" Mar 18 14:30:10 crc kubenswrapper[4912]: I0318 14:30:10.758154 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_311d61bd-9241-486c-a8d5-22fc93f208bc/openstackclient/0.log" Mar 18 14:30:10 crc kubenswrapper[4912]: I0318 14:30:10.869943 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-7nd97_5353be6e-99f8-4367-a237-99e0bd3bab04/ovn-controller/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.048670 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qqfb7_f2b4068a-eb2f-4744-afc6-353f9704e68f/openstack-network-exporter/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.244588 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbb6v_2524b573-8f88-4fd6-8b1d-c3a4f39e0620/ovsdb-server-init/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.461999 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbb6v_2524b573-8f88-4fd6-8b1d-c3a4f39e0620/ovs-vswitchd/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.542715 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tbb6v_2524b573-8f88-4fd6-8b1d-c3a4f39e0620/ovsdb-server-init/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.561615 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-tbb6v_2524b573-8f88-4fd6-8b1d-c3a4f39e0620/ovsdb-server/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.818668 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_85faf7f2-1b95-4210-88b1-cae393033960/openstack-network-exporter/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.821870 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-jljj4_2fd0ee26-077e-472f-9bfb-0f6247895102/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:11 crc kubenswrapper[4912]: I0318 14:30:11.905068 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_85faf7f2-1b95-4210-88b1-cae393033960/ovn-northd/0.log" Mar 18 14:30:12 crc kubenswrapper[4912]: I0318 14:30:12.098370 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6bc55c08-667c-4803-86d4-e30cf29b4bb6/openstack-network-exporter/0.log" Mar 18 14:30:12 crc kubenswrapper[4912]: I0318 14:30:12.122599 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6bc55c08-667c-4803-86d4-e30cf29b4bb6/ovsdbserver-nb/0.log" Mar 18 14:30:12 crc kubenswrapper[4912]: I0318 14:30:12.327670 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b193ddb0-beb0-47c2-80c0-a301e580d2b1/openstack-network-exporter/0.log" Mar 18 14:30:12 crc kubenswrapper[4912]: I0318 14:30:12.396826 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b193ddb0-beb0-47c2-80c0-a301e580d2b1/ovsdbserver-sb/0.log" Mar 18 14:30:12 crc kubenswrapper[4912]: I0318 14:30:12.775376 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f6bb888-s77j2_7dda5aa6-ee95-4b75-9204-b014aba202ae/placement-api/0.log" Mar 18 14:30:12 crc kubenswrapper[4912]: I0318 14:30:12.780399 4912 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a406878a-6e90-4c47-8e23-875349b55b1d/init-config-reloader/0.log" Mar 18 14:30:12 crc kubenswrapper[4912]: I0318 14:30:12.801551 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f6bb888-s77j2_7dda5aa6-ee95-4b75-9204-b014aba202ae/placement-log/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.067368 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a406878a-6e90-4c47-8e23-875349b55b1d/prometheus/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.070835 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a406878a-6e90-4c47-8e23-875349b55b1d/thanos-sidecar/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.093350 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a406878a-6e90-4c47-8e23-875349b55b1d/config-reloader/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.103468 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a406878a-6e90-4c47-8e23-875349b55b1d/init-config-reloader/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.335117 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7b0b7b32-0583-4813-b9fd-9697bf4e9d05/setup-container/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.609400 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7b0b7b32-0583-4813-b9fd-9697bf4e9d05/rabbitmq/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.622959 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7b0b7b32-0583-4813-b9fd-9697bf4e9d05/setup-container/0.log" Mar 18 14:30:13 crc kubenswrapper[4912]: I0318 14:30:13.748677 4912 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_775a5a2c-1365-4984-9e9f-a11cd7f48bb9/setup-container/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.142649 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_775a5a2c-1365-4984-9e9f-a11cd7f48bb9/setup-container/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.143176 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_775a5a2c-1365-4984-9e9f-a11cd7f48bb9/rabbitmq/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.166328 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_db9357ac-df66-4e60-bddf-38d4f8847623/setup-container/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.387703 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_db9357ac-df66-4e60-bddf-38d4f8847623/setup-container/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.452690 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_db9357ac-df66-4e60-bddf-38d4f8847623/rabbitmq/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.544072 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0/setup-container/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.743798 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0/setup-container/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.849677 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-cbbbd_560fc636-c082-4462-935f-1323ed49eef4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:14 crc kubenswrapper[4912]: I0318 14:30:14.856469 4912 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-server-2_a8ee8871-d3ec-4147-b2a2-848eb1bbc9e0/rabbitmq/0.log" Mar 18 14:30:15 crc kubenswrapper[4912]: I0318 14:30:15.092825 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qwgqp_0e7cc04c-de03-4e24-b041-663be152ac0e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:15 crc kubenswrapper[4912]: I0318 14:30:15.172225 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ssf4v_edb56b41-20f0-40db-925f-fb26ec712461/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:15 crc kubenswrapper[4912]: I0318 14:30:15.418804 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-54hl4_3fc8fc8f-f538-4240-a174-5144a5592e75/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:15 crc kubenswrapper[4912]: I0318 14:30:15.564301 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sjjdz_725fe434-fed8-4c32-b6a7-8f320dd6e0fd/ssh-known-hosts-edpm-deployment/0.log" Mar 18 14:30:15 crc kubenswrapper[4912]: I0318 14:30:15.860930 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f59b977c9-rwwx4_08a4effe-9a7e-449c-aba4-74d4b7a4f0ae/proxy-server/0.log" Mar 18 14:30:15 crc kubenswrapper[4912]: I0318 14:30:15.899954 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f59b977c9-rwwx4_08a4effe-9a7e-449c-aba4-74d4b7a4f0ae/proxy-httpd/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.030992 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cczr8_4b02fe29-bc51-4fc4-86e7-44fb75e20e2b/swift-ring-rebalance/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.197101 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/account-auditor/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.230241 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/account-reaper/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.347871 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/account-replicator/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.472323 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/container-auditor/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.478428 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/account-server/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.588276 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/container-replicator/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.590972 4912 scope.go:117] "RemoveContainer" containerID="ec604b75259b9429055a65227930f180a6208f345639a1334cb863b306e31875" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.659808 4912 scope.go:117] "RemoveContainer" containerID="ba245ed04fc2e89a406c2c3c06ca334c776ff4821974439487e5cdcdd41faa32" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.687454 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/container-server/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.724496 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/container-updater/0.log" Mar 18 14:30:16 crc 
kubenswrapper[4912]: I0318 14:30:16.907719 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/object-expirer/0.log" Mar 18 14:30:16 crc kubenswrapper[4912]: I0318 14:30:16.908989 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/object-auditor/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.001576 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/object-server/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.095390 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/object-replicator/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.207462 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/object-updater/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.219182 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/rsync/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.243151 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8f71e79a-72ad-4de7-9b24-7ac75884deae/swift-recon-cron/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.552518 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9pxs7_edc150d2-9448-4f82-a4a4-eeb5b0b06829/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.620337 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-fdxxm_a1bc0fc3-1b8a-4894-ba7c-a4639e6a8660/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:17 crc kubenswrapper[4912]: I0318 14:30:17.867347 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_7eed3616-3121-46a7-9781-b245e6694fb9/test-operator-logs-container/0.log" Mar 18 14:30:18 crc kubenswrapper[4912]: I0318 14:30:18.387854 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-csz77_b28d5aa0-b546-4057-80d6-04277d6af5e3/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:30:18 crc kubenswrapper[4912]: I0318 14:30:18.461462 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fab7b705-5ef2-46e6-851d-5c38d246ee55/tempest-tests-tempest-tests-runner/0.log" Mar 18 14:30:31 crc kubenswrapper[4912]: I0318 14:30:31.308131 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5f35e63e-80c3-4dca-b383-9650e3aa63a2/memcached/0.log" Mar 18 14:31:00 crc kubenswrapper[4912]: I0318 14:31:00.064986 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt_3982fa5f-f22b-4e44-8f14-3edfda813bb1/util/0.log" Mar 18 14:31:00 crc kubenswrapper[4912]: I0318 14:31:00.401501 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt_3982fa5f-f22b-4e44-8f14-3edfda813bb1/pull/0.log" Mar 18 14:31:00 crc kubenswrapper[4912]: I0318 14:31:00.406161 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt_3982fa5f-f22b-4e44-8f14-3edfda813bb1/util/0.log" Mar 18 14:31:00 crc 
kubenswrapper[4912]: I0318 14:31:00.420776 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt_3982fa5f-f22b-4e44-8f14-3edfda813bb1/pull/0.log" Mar 18 14:31:00 crc kubenswrapper[4912]: I0318 14:31:00.673137 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt_3982fa5f-f22b-4e44-8f14-3edfda813bb1/util/0.log" Mar 18 14:31:00 crc kubenswrapper[4912]: I0318 14:31:00.717319 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt_3982fa5f-f22b-4e44-8f14-3edfda813bb1/extract/0.log" Mar 18 14:31:00 crc kubenswrapper[4912]: I0318 14:31:00.718689 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_936ef2b154628bef58d6e2cff4fd78009309dd4b343e196a344d7f8160l79gt_3982fa5f-f22b-4e44-8f14-3edfda813bb1/pull/0.log" Mar 18 14:31:02 crc kubenswrapper[4912]: I0318 14:31:02.983011 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-glhjp_e7b90186-2a06-42a0-aec9-8d8f27dfe4dd/manager/1.log" Mar 18 14:31:02 crc kubenswrapper[4912]: I0318 14:31:02.996634 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-pcz7q_2faefcc2-b6a3-4dee-a077-af88038f3565/manager/1.log" Mar 18 14:31:03 crc kubenswrapper[4912]: I0318 14:31:03.259023 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-glhjp_e7b90186-2a06-42a0-aec9-8d8f27dfe4dd/manager/0.log" Mar 18 14:31:03 crc kubenswrapper[4912]: I0318 14:31:03.519336 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-65r9q_1f17f2a1-55b9-493b-9a8a-3d53f21becb9/manager/1.log" Mar 18 14:31:03 crc kubenswrapper[4912]: I0318 14:31:03.780851 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-65r9q_1f17f2a1-55b9-493b-9a8a-3d53f21becb9/manager/0.log" Mar 18 14:31:04 crc kubenswrapper[4912]: I0318 14:31:04.095889 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-wljg2_f695b268-a8b7-4b72-a37b-dd342d7d369a/manager/1.log" Mar 18 14:31:04 crc kubenswrapper[4912]: I0318 14:31:04.265934 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-wljg2_f695b268-a8b7-4b72-a37b-dd342d7d369a/manager/0.log" Mar 18 14:31:04 crc kubenswrapper[4912]: I0318 14:31:04.851998 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-72xxs_98fed63c-9006-4589-a119-1e25fb115041/manager/1.log" Mar 18 14:31:04 crc kubenswrapper[4912]: I0318 14:31:04.933698 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-72xxs_98fed63c-9006-4589-a119-1e25fb115041/manager/0.log" Mar 18 14:31:05 crc kubenswrapper[4912]: I0318 14:31:05.670433 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-6ksxg_334b170e-0f84-42b2-81a6-8c469d187fa3/manager/1.log" Mar 18 14:31:05 crc kubenswrapper[4912]: I0318 14:31:05.820161 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-pcz7q_2faefcc2-b6a3-4dee-a077-af88038f3565/manager/0.log" Mar 18 14:31:05 crc kubenswrapper[4912]: I0318 14:31:05.884806 4912 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-qsghf_b7ec4270-842e-49cb-8d22-16df7b212443/manager/0.log" Mar 18 14:31:05 crc kubenswrapper[4912]: I0318 14:31:05.896282 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-6ksxg_334b170e-0f84-42b2-81a6-8c469d187fa3/manager/0.log" Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.349769 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-76b87776c9-vqtmj_8b12ea77-cfde-4e3d-bdc7-04c350f17c09/manager/0.log" Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.369650 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-zp69w_f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0/manager/1.log" Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.408706 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-zp69w_f38ac2a2-fa8e-4c04-a9e0-e495dee1ecf0/manager/0.log" Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.576852 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-l9d25_6afa3dcd-776b-4472-9e54-31e102d2fb67/manager/1.log" Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.761196 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-l9d25_6afa3dcd-776b-4472-9e54-31e102d2fb67/manager/0.log" Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.915189 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-wm76g_6ff20347-b4ef-4d01-966c-5ba69dcf546c/manager/1.log" Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.998816 4912 patch_prober.go:28] 
interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:31:06 crc kubenswrapper[4912]: I0318 14:31:06.998874 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:31:07 crc kubenswrapper[4912]: I0318 14:31:07.114667 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-wm76g_6ff20347-b4ef-4d01-966c-5ba69dcf546c/manager/0.log" Mar 18 14:31:07 crc kubenswrapper[4912]: I0318 14:31:07.235614 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-f95vk_3821e364-991e-4a58-88e6-cf499d12aa70/manager/1.log" Mar 18 14:31:07 crc kubenswrapper[4912]: I0318 14:31:07.510600 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-52z7q_45ef8022-adf2-46bc-a112-a5532880c080/manager/1.log" Mar 18 14:31:07 crc kubenswrapper[4912]: I0318 14:31:07.568250 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-f95vk_3821e364-991e-4a58-88e6-cf499d12aa70/manager/0.log" Mar 18 14:31:07 crc kubenswrapper[4912]: I0318 14:31:07.678695 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-52z7q_45ef8022-adf2-46bc-a112-a5532880c080/manager/0.log" Mar 18 14:31:07 crc kubenswrapper[4912]: I0318 14:31:07.912313 4912 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s_7ffd183f-20a4-4586-ac75-597797ada23c/manager/1.log" Mar 18 14:31:08 crc kubenswrapper[4912]: I0318 14:31:08.002777 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-ntn4s_7ffd183f-20a4-4586-ac75-597797ada23c/manager/0.log" Mar 18 14:31:08 crc kubenswrapper[4912]: I0318 14:31:08.349537 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-57c55bf5f4-gflkq_ef041eab-e584-4a2a-8008-9a7f07f75f70/operator/0.log" Mar 18 14:31:08 crc kubenswrapper[4912]: I0318 14:31:08.953849 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tkt7x_10c9b954-d1cb-4055-a082-5b06828b5faa/registry-server/1.log" Mar 18 14:31:09 crc kubenswrapper[4912]: I0318 14:31:09.499618 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tkt7x_10c9b954-d1cb-4055-a082-5b06828b5faa/registry-server/0.log" Mar 18 14:31:09 crc kubenswrapper[4912]: I0318 14:31:09.867190 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-drnxt_e5f93e56-4ca9-413c-9954-f94f182b6606/manager/1.log" Mar 18 14:31:09 crc kubenswrapper[4912]: I0318 14:31:09.956744 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-drnxt_e5f93e56-4ca9-413c-9954-f94f182b6606/manager/0.log" Mar 18 14:31:10 crc kubenswrapper[4912]: I0318 14:31:10.274213 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-9kt49_67ab4d42-cf77-45ce-9bf7-f0db056c4151/manager/1.log" Mar 18 14:31:10 crc kubenswrapper[4912]: I0318 14:31:10.427025 4912 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-9kt49_67ab4d42-cf77-45ce-9bf7-f0db056c4151/manager/0.log" Mar 18 14:31:10 crc kubenswrapper[4912]: I0318 14:31:10.852875 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pwv5l_e00a6814-84ad-42fc-a5c5-b629750cfa80/operator/0.log" Mar 18 14:31:11 crc kubenswrapper[4912]: I0318 14:31:11.127394 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-cllp9_2c6e1e3e-7303-42ab-ac5d-fa2bc2f648e2/manager/0.log" Mar 18 14:31:11 crc kubenswrapper[4912]: I0318 14:31:11.464459 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-585bd669c7-vrxh8_d96a656e-5436-4af3-b4cd-98c485c402a1/manager/0.log" Mar 18 14:31:11 crc kubenswrapper[4912]: I0318 14:31:11.910828 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-5wn69_13092522-58a7-4c49-9164-41523060735e/manager/1.log" Mar 18 14:31:12 crc kubenswrapper[4912]: I0318 14:31:12.172671 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-6bzx6_692fb335-57d8-465c-b7ef-d94c53f84523/manager/1.log" Mar 18 14:31:12 crc kubenswrapper[4912]: I0318 14:31:12.313955 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-6bzx6_692fb335-57d8-465c-b7ef-d94c53f84523/manager/0.log" Mar 18 14:31:12 crc kubenswrapper[4912]: I0318 14:31:12.499540 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-vgvxb_19eca4e0-1677-4af5-993a-4cd45173287e/manager/0.log" Mar 18 14:31:12 crc kubenswrapper[4912]: I0318 14:31:12.604994 4912 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-54d55b7b75-h9lqp_35ae7eba-4b8f-43ac-b828-5cbc84fed044/manager/0.log" Mar 18 14:31:18 crc kubenswrapper[4912]: I0318 14:31:18.190912 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-5wn69_13092522-58a7-4c49-9164-41523060735e/manager/0.log" Mar 18 14:31:37 crc kubenswrapper[4912]: I0318 14:31:36.999420 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:31:37 crc kubenswrapper[4912]: I0318 14:31:37.000340 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:31:40 crc kubenswrapper[4912]: I0318 14:31:40.462935 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rb9sv_de010a28-87e3-4340-87fc-9242ad95647a/control-plane-machine-set-operator/0.log" Mar 18 14:31:40 crc kubenswrapper[4912]: I0318 14:31:40.702398 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7cw4z_ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e/kube-rbac-proxy/0.log" Mar 18 14:31:40 crc kubenswrapper[4912]: I0318 14:31:40.802423 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7cw4z_ff1b19e7-d417-4b14-92ad-e2ad64c1fa1e/machine-api-operator/0.log" Mar 18 14:31:57 crc 
kubenswrapper[4912]: I0318 14:31:57.830280 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-94ssf_fab70011-1512-4414-9319-247cf2ccd2b2/cert-manager-controller/0.log" Mar 18 14:31:58 crc kubenswrapper[4912]: I0318 14:31:58.247680 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n7gmv_0dcabbbc-e386-4bcf-9fc6-51e388ad3d36/cert-manager-cainjector/0.log" Mar 18 14:31:58 crc kubenswrapper[4912]: I0318 14:31:58.396458 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cxrkp_c672e269-a0f9-42e0-964c-ea26f3d86a58/cert-manager-webhook/0.log" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.170629 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n94qm"] Mar 18 14:32:00 crc kubenswrapper[4912]: E0318 14:32:00.171749 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c1e3c5-c136-42cb-a0f6-4067a05a67c4" containerName="oc" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.171766 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c1e3c5-c136-42cb-a0f6-4067a05a67c4" containerName="oc" Mar 18 14:32:00 crc kubenswrapper[4912]: E0318 14:32:00.171807 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b531f4d4-ef53-4c22-bdd9-91fedb2f4971" containerName="collect-profiles" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.171814 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="b531f4d4-ef53-4c22-bdd9-91fedb2f4971" containerName="collect-profiles" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.172090 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="b531f4d4-ef53-4c22-bdd9-91fedb2f4971" containerName="collect-profiles" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.172135 4912 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="49c1e3c5-c136-42cb-a0f6-4067a05a67c4" containerName="oc" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.173075 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n94qm" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.176364 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.176434 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.176370 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.196126 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n94qm"] Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.318949 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7l8\" (UniqueName: \"kubernetes.io/projected/20d84325-2fba-463a-9c88-48eb40e0e43e-kube-api-access-rp7l8\") pod \"auto-csr-approver-29564072-n94qm\" (UID: \"20d84325-2fba-463a-9c88-48eb40e0e43e\") " pod="openshift-infra/auto-csr-approver-29564072-n94qm" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.421105 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7l8\" (UniqueName: \"kubernetes.io/projected/20d84325-2fba-463a-9c88-48eb40e0e43e-kube-api-access-rp7l8\") pod \"auto-csr-approver-29564072-n94qm\" (UID: \"20d84325-2fba-463a-9c88-48eb40e0e43e\") " pod="openshift-infra/auto-csr-approver-29564072-n94qm" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.454134 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7l8\" (UniqueName: 
\"kubernetes.io/projected/20d84325-2fba-463a-9c88-48eb40e0e43e-kube-api-access-rp7l8\") pod \"auto-csr-approver-29564072-n94qm\" (UID: \"20d84325-2fba-463a-9c88-48eb40e0e43e\") " pod="openshift-infra/auto-csr-approver-29564072-n94qm" Mar 18 14:32:00 crc kubenswrapper[4912]: I0318 14:32:00.500718 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n94qm" Mar 18 14:32:01 crc kubenswrapper[4912]: I0318 14:32:01.255377 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n94qm"] Mar 18 14:32:01 crc kubenswrapper[4912]: I0318 14:32:01.945766 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n94qm" event={"ID":"20d84325-2fba-463a-9c88-48eb40e0e43e","Type":"ContainerStarted","Data":"524011caf69cdc7907262144cba9e8cd4f37ce42b029fb3018cccf1ace1a4a97"} Mar 18 14:32:02 crc kubenswrapper[4912]: I0318 14:32:02.961604 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n94qm" event={"ID":"20d84325-2fba-463a-9c88-48eb40e0e43e","Type":"ContainerStarted","Data":"4f469a18c9d96bfb6f51ea966f431008006fdf353872583bae36a986335d18e2"} Mar 18 14:32:02 crc kubenswrapper[4912]: I0318 14:32:02.986873 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564072-n94qm" podStartSLOduration=1.872219471 podStartE2EDuration="2.986844492s" podCreationTimestamp="2026-03-18 14:32:00 +0000 UTC" firstStartedPulling="2026-03-18 14:32:01.27480997 +0000 UTC m=+5369.734237395" lastFinishedPulling="2026-03-18 14:32:02.389434991 +0000 UTC m=+5370.848862416" observedRunningTime="2026-03-18 14:32:02.97903095 +0000 UTC m=+5371.438458395" watchObservedRunningTime="2026-03-18 14:32:02.986844492 +0000 UTC m=+5371.446271917" Mar 18 14:32:04 crc kubenswrapper[4912]: I0318 14:32:04.993275 4912 generic.go:334] "Generic (PLEG): container 
finished" podID="20d84325-2fba-463a-9c88-48eb40e0e43e" containerID="4f469a18c9d96bfb6f51ea966f431008006fdf353872583bae36a986335d18e2" exitCode=0 Mar 18 14:32:04 crc kubenswrapper[4912]: I0318 14:32:04.993381 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n94qm" event={"ID":"20d84325-2fba-463a-9c88-48eb40e0e43e","Type":"ContainerDied","Data":"4f469a18c9d96bfb6f51ea966f431008006fdf353872583bae36a986335d18e2"} Mar 18 14:32:06 crc kubenswrapper[4912]: I0318 14:32:06.999008 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:06.999813 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:06.999873 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.001018 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.001096 4912 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" gracePeriod=600 Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.042999 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n94qm" event={"ID":"20d84325-2fba-463a-9c88-48eb40e0e43e","Type":"ContainerDied","Data":"524011caf69cdc7907262144cba9e8cd4f37ce42b029fb3018cccf1ace1a4a97"} Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.043076 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524011caf69cdc7907262144cba9e8cd4f37ce42b029fb3018cccf1ace1a4a97" Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.120140 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n94qm" Mar 18 14:32:07 crc kubenswrapper[4912]: E0318 14:32:07.208206 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.209977 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp7l8\" (UniqueName: \"kubernetes.io/projected/20d84325-2fba-463a-9c88-48eb40e0e43e-kube-api-access-rp7l8\") pod \"20d84325-2fba-463a-9c88-48eb40e0e43e\" (UID: \"20d84325-2fba-463a-9c88-48eb40e0e43e\") " Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.218781 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/20d84325-2fba-463a-9c88-48eb40e0e43e-kube-api-access-rp7l8" (OuterVolumeSpecName: "kube-api-access-rp7l8") pod "20d84325-2fba-463a-9c88-48eb40e0e43e" (UID: "20d84325-2fba-463a-9c88-48eb40e0e43e"). InnerVolumeSpecName "kube-api-access-rp7l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:32:07 crc kubenswrapper[4912]: I0318 14:32:07.316448 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp7l8\" (UniqueName: \"kubernetes.io/projected/20d84325-2fba-463a-9c88-48eb40e0e43e-kube-api-access-rp7l8\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:08 crc kubenswrapper[4912]: I0318 14:32:08.064244 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" exitCode=0 Mar 18 14:32:08 crc kubenswrapper[4912]: I0318 14:32:08.064324 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589"} Mar 18 14:32:08 crc kubenswrapper[4912]: I0318 14:32:08.064817 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n94qm" Mar 18 14:32:08 crc kubenswrapper[4912]: I0318 14:32:08.064840 4912 scope.go:117] "RemoveContainer" containerID="c5ced18fd2e5d09788e975802c82a54bc9a779ddaeae6d48f618506d7040a53a" Mar 18 14:32:08 crc kubenswrapper[4912]: I0318 14:32:08.066125 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:32:08 crc kubenswrapper[4912]: E0318 14:32:08.066710 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:32:08 crc kubenswrapper[4912]: I0318 14:32:08.219775 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-lmtc2"] Mar 18 14:32:08 crc kubenswrapper[4912]: I0318 14:32:08.245305 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-lmtc2"] Mar 18 14:32:10 crc kubenswrapper[4912]: I0318 14:32:10.244418 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c16250-e6b3-4308-bf15-d4633c661d9e" path="/var/lib/kubelet/pods/54c16250-e6b3-4308-bf15-d4633c661d9e/volumes" Mar 18 14:32:15 crc kubenswrapper[4912]: I0318 14:32:15.088341 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-v7l52_a7f16942-c64f-46a0-84ad-a52844af0d08/nmstate-console-plugin/0.log" Mar 18 14:32:15 crc kubenswrapper[4912]: I0318 14:32:15.283184 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-zszmc_23eaceef-5a11-4610-91b0-6ca3c42c167f/nmstate-handler/0.log" Mar 18 14:32:15 crc kubenswrapper[4912]: I0318 14:32:15.347533 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-775r6_e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf/kube-rbac-proxy/0.log" Mar 18 14:32:15 crc kubenswrapper[4912]: I0318 14:32:15.497095 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-775r6_e3aa2cf0-8cc6-4c4f-b192-c3bf7113f7bf/nmstate-metrics/0.log" Mar 18 14:32:15 crc kubenswrapper[4912]: I0318 14:32:15.553968 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-2ppb9_5343c7cb-068f-489d-be0f-a09ea457e71f/nmstate-operator/0.log" Mar 18 14:32:15 crc kubenswrapper[4912]: I0318 14:32:15.722748 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-jkd5w_3e51cc8b-d69c-4be9-8b12-c1a10c653621/nmstate-webhook/0.log" Mar 18 14:32:21 crc kubenswrapper[4912]: I0318 14:32:21.234088 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:32:21 crc kubenswrapper[4912]: E0318 14:32:21.235861 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:32:30 crc kubenswrapper[4912]: I0318 14:32:30.695982 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-867987c6b7-jg2ct_8efdcb68-92df-434c-8446-5be1ef0a94ba/kube-rbac-proxy/0.log" Mar 18 14:32:30 crc kubenswrapper[4912]: I0318 14:32:30.814835 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-867987c6b7-jg2ct_8efdcb68-92df-434c-8446-5be1ef0a94ba/manager/1.log" Mar 18 14:32:30 crc kubenswrapper[4912]: I0318 14:32:30.936258 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-867987c6b7-jg2ct_8efdcb68-92df-434c-8446-5be1ef0a94ba/manager/0.log" Mar 18 14:32:35 crc kubenswrapper[4912]: I0318 14:32:35.228258 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:32:35 crc kubenswrapper[4912]: E0318 14:32:35.229266 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:32:46 crc kubenswrapper[4912]: I0318 14:32:46.631362 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-2r5xf_37808f2f-08d5-432e-8ad6-69ad0b0e573a/prometheus-operator/0.log" Mar 18 14:32:46 crc kubenswrapper[4912]: I0318 14:32:46.863408 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68d7879b9-66zt6_0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2/prometheus-operator-admission-webhook/0.log" Mar 18 14:32:46 crc kubenswrapper[4912]: I0318 14:32:46.898749 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68d7879b9-flqsx_d4045f06-e567-4dda-8192-2dbef917a7a0/prometheus-operator-admission-webhook/0.log" Mar 18 14:32:47 crc kubenswrapper[4912]: I0318 14:32:47.089698 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-lcgrk_ffcc0a7f-efff-4a18-8002-7b33a557293c/operator/0.log" Mar 18 14:32:47 crc kubenswrapper[4912]: I0318 14:32:47.143063 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-w8j42_be9dbd3b-a78d-4306-b834-3cd7c60d7d05/observability-ui-dashboards/0.log" Mar 18 14:32:47 crc kubenswrapper[4912]: I0318 14:32:47.307401 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-7bb4554dcb-4hc2x_b1062176-da75-4c7d-a3fc-b5ecee790973/perses-operator/0.log" Mar 18 14:32:48 crc kubenswrapper[4912]: I0318 14:32:48.229843 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:32:48 crc kubenswrapper[4912]: E0318 14:32:48.230676 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:33:03 crc kubenswrapper[4912]: I0318 14:33:03.228719 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:33:03 crc kubenswrapper[4912]: E0318 14:33:03.230119 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:33:07 crc kubenswrapper[4912]: I0318 14:33:07.717185 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-9lr46_fb5bb2a5-d719-49dd-a9b3-8734f6944648/cluster-logging-operator/0.log" Mar 18 14:33:07 crc kubenswrapper[4912]: I0318 14:33:07.718006 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-zx7v8_7853aab4-c5b8-400c-9b11-80102982ddd3/collector/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.012647 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_78d0ba71-aecb-4e22-a459-c5f690268e0e/loki-compactor/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.032565 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-s2ztv_cd78a5ca-41b5-48af-a603-0ac01cbde069/loki-distributor/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.261569 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-b5bdf65c4-ldbjt_22169096-dc0c-47ea-a40e-728cac38c1d4/gateway/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.313541 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-b5bdf65c4-ldbjt_22169096-dc0c-47ea-a40e-728cac38c1d4/opa/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.466856 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-b5bdf65c4-vqfpz_86abc7c8-2019-4f25-84a0-f764bc3f10d6/gateway/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.502965 4912 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-b5bdf65c4-vqfpz_86abc7c8-2019-4f25-84a0-f764bc3f10d6/opa/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.621678 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_5c4fd206-4176-47ef-9cee-8be6e9ed396f/loki-index-gateway/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.837424 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_9951708d-b5a5-4dea-9cb5-a89c96f2a404/loki-ingester/0.log" Mar 18 14:33:08 crc kubenswrapper[4912]: I0318 14:33:08.922256 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-jgbgh_d49e1c94-aaf7-4502-ad56-46296a08cf03/loki-querier/0.log" Mar 18 14:33:09 crc kubenswrapper[4912]: I0318 14:33:09.060728 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-tcbf4_95f374dc-f34c-48df-a280-3434f082b6d0/loki-query-frontend/0.log" Mar 18 14:33:14 crc kubenswrapper[4912]: I0318 14:33:14.228789 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:33:14 crc kubenswrapper[4912]: E0318 14:33:14.230246 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:33:16 crc kubenswrapper[4912]: I0318 14:33:16.938413 4912 scope.go:117] "RemoveContainer" containerID="bc60b4e43daef71b12850509b354819220cb4d76af8af7327ab323aa87475de8" Mar 18 14:33:26 crc kubenswrapper[4912]: 
I0318 14:33:26.344527 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-tpm8v_1c9a2194-27ba-4a86-b5c1-e8356c71227f/kube-rbac-proxy/0.log" Mar 18 14:33:26 crc kubenswrapper[4912]: I0318 14:33:26.634946 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-tpm8v_1c9a2194-27ba-4a86-b5c1-e8356c71227f/controller/0.log" Mar 18 14:33:26 crc kubenswrapper[4912]: I0318 14:33:26.676377 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-frr-files/0.log" Mar 18 14:33:26 crc kubenswrapper[4912]: I0318 14:33:26.952696 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-frr-files/0.log" Mar 18 14:33:26 crc kubenswrapper[4912]: I0318 14:33:26.992158 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-reloader/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.014723 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-metrics/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.033157 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-reloader/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.304847 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-reloader/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.309484 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-frr-files/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.327308 4912 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-metrics/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.362789 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-metrics/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.739276 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/controller/1.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.782990 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-reloader/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.783071 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-metrics/0.log" Mar 18 14:33:27 crc kubenswrapper[4912]: I0318 14:33:27.784667 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/cp-frr-files/0.log" Mar 18 14:33:28 crc kubenswrapper[4912]: I0318 14:33:28.229315 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:33:28 crc kubenswrapper[4912]: E0318 14:33:28.229880 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:33:28 crc kubenswrapper[4912]: I0318 14:33:28.604640 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/frr-metrics/0.log" Mar 18 14:33:28 crc kubenswrapper[4912]: I0318 14:33:28.616288 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/controller/0.log" Mar 18 14:33:28 crc kubenswrapper[4912]: I0318 14:33:28.682141 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/frr/1.log" Mar 18 14:33:28 crc kubenswrapper[4912]: I0318 14:33:28.903137 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/kube-rbac-proxy-frr/0.log" Mar 18 14:33:28 crc kubenswrapper[4912]: I0318 14:33:28.937732 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/kube-rbac-proxy/0.log" Mar 18 14:33:28 crc kubenswrapper[4912]: I0318 14:33:28.995023 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/reloader/0.log" Mar 18 14:33:29 crc kubenswrapper[4912]: I0318 14:33:29.271436 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-grpwd_7d7516e2-d2c4-4f18-9cc6-d2aad94db27e/frr-k8s-webhook-server/1.log" Mar 18 14:33:29 crc kubenswrapper[4912]: I0318 14:33:29.391199 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-grpwd_7d7516e2-d2c4-4f18-9cc6-d2aad94db27e/frr-k8s-webhook-server/0.log" Mar 18 14:33:29 crc kubenswrapper[4912]: I0318 14:33:29.547925 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-666765756d-v7mtx_9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4/manager/1.log" Mar 18 14:33:29 crc kubenswrapper[4912]: I0318 14:33:29.744378 4912 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-666765756d-v7mtx_9b2b99dd-aba6-40fe-bdc8-a15f6b9815c4/manager/0.log" Mar 18 14:33:29 crc kubenswrapper[4912]: I0318 14:33:29.882526 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54bbf46695-l6jq5_9ead324e-7891-4059-9d70-90462b2cc852/webhook-server/1.log" Mar 18 14:33:30 crc kubenswrapper[4912]: I0318 14:33:30.149200 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54bbf46695-l6jq5_9ead324e-7891-4059-9d70-90462b2cc852/webhook-server/0.log" Mar 18 14:33:30 crc kubenswrapper[4912]: I0318 14:33:30.191463 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zw7wz_c6af6424-58bd-4c40-a86c-15627b762a9a/kube-rbac-proxy/0.log" Mar 18 14:33:30 crc kubenswrapper[4912]: I0318 14:33:30.885138 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zw7wz_c6af6424-58bd-4c40-a86c-15627b762a9a/speaker/1.log" Mar 18 14:33:31 crc kubenswrapper[4912]: I0318 14:33:31.063148 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ngzqk_0475f7b9-387c-422d-88c8-90416895b720/frr/0.log" Mar 18 14:33:31 crc kubenswrapper[4912]: I0318 14:33:31.259569 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zw7wz_c6af6424-58bd-4c40-a86c-15627b762a9a/speaker/0.log" Mar 18 14:33:41 crc kubenswrapper[4912]: I0318 14:33:41.227821 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:33:41 crc kubenswrapper[4912]: E0318 14:33:41.229407 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:33:47 crc kubenswrapper[4912]: I0318 14:33:47.357673 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm_3252c88f-2452-4b5a-9ebc-4410c8c9f822/util/0.log" Mar 18 14:33:47 crc kubenswrapper[4912]: I0318 14:33:47.579433 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm_3252c88f-2452-4b5a-9ebc-4410c8c9f822/pull/0.log" Mar 18 14:33:47 crc kubenswrapper[4912]: I0318 14:33:47.625013 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm_3252c88f-2452-4b5a-9ebc-4410c8c9f822/util/0.log" Mar 18 14:33:47 crc kubenswrapper[4912]: I0318 14:33:47.644643 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm_3252c88f-2452-4b5a-9ebc-4410c8c9f822/pull/0.log" Mar 18 14:33:47 crc kubenswrapper[4912]: I0318 14:33:47.866938 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm_3252c88f-2452-4b5a-9ebc-4410c8c9f822/util/0.log" Mar 18 14:33:47 crc kubenswrapper[4912]: I0318 14:33:47.920401 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm_3252c88f-2452-4b5a-9ebc-4410c8c9f822/pull/0.log" Mar 18 14:33:47 crc kubenswrapper[4912]: I0318 14:33:47.963556 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rsqcm_3252c88f-2452-4b5a-9ebc-4410c8c9f822/extract/0.log" Mar 18 14:33:48 crc kubenswrapper[4912]: I0318 14:33:48.119761 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c_9d7017e2-48e0-4868-b763-4186e771faae/util/0.log" Mar 18 14:33:48 crc kubenswrapper[4912]: I0318 14:33:48.315368 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c_9d7017e2-48e0-4868-b763-4186e771faae/pull/0.log" Mar 18 14:33:48 crc kubenswrapper[4912]: I0318 14:33:48.370986 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c_9d7017e2-48e0-4868-b763-4186e771faae/util/0.log" Mar 18 14:33:48 crc kubenswrapper[4912]: I0318 14:33:48.442219 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c_9d7017e2-48e0-4868-b763-4186e771faae/pull/0.log" Mar 18 14:33:48 crc kubenswrapper[4912]: I0318 14:33:48.874714 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c_9d7017e2-48e0-4868-b763-4186e771faae/util/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.016559 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c_9d7017e2-48e0-4868-b763-4186e771faae/pull/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.023659 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sw55c_9d7017e2-48e0-4868-b763-4186e771faae/extract/0.log" Mar 
18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.257604 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9_795cc23a-a174-4e21-8a01-6f631f937583/util/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.477644 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9_795cc23a-a174-4e21-8a01-6f631f937583/util/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.503198 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9_795cc23a-a174-4e21-8a01-6f631f937583/pull/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.591586 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9_795cc23a-a174-4e21-8a01-6f631f937583/pull/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.804961 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9_795cc23a-a174-4e21-8a01-6f631f937583/extract/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.831731 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9_795cc23a-a174-4e21-8a01-6f631f937583/pull/0.log" Mar 18 14:33:49 crc kubenswrapper[4912]: I0318 14:33:49.834024 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d58fwb9_795cc23a-a174-4e21-8a01-6f631f937583/util/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.060999 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969_f5b8d99c-4d5f-4c25-a12b-3d45756fced8/util/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.353960 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969_f5b8d99c-4d5f-4c25-a12b-3d45756fced8/util/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.383780 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969_f5b8d99c-4d5f-4c25-a12b-3d45756fced8/pull/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.383947 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969_f5b8d99c-4d5f-4c25-a12b-3d45756fced8/pull/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.648149 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969_f5b8d99c-4d5f-4c25-a12b-3d45756fced8/util/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.649429 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969_f5b8d99c-4d5f-4c25-a12b-3d45756fced8/extract/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.672223 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cg6969_f5b8d99c-4d5f-4c25-a12b-3d45756fced8/pull/0.log" Mar 18 14:33:50 crc kubenswrapper[4912]: I0318 14:33:50.884293 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g_d97da3f2-c9b0-42d1-a5ea-795994d0f5cb/util/0.log" Mar 18 
14:33:51 crc kubenswrapper[4912]: I0318 14:33:51.129650 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g_d97da3f2-c9b0-42d1-a5ea-795994d0f5cb/util/0.log" Mar 18 14:33:51 crc kubenswrapper[4912]: I0318 14:33:51.148507 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g_d97da3f2-c9b0-42d1-a5ea-795994d0f5cb/pull/0.log" Mar 18 14:33:51 crc kubenswrapper[4912]: I0318 14:33:51.167215 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g_d97da3f2-c9b0-42d1-a5ea-795994d0f5cb/pull/0.log" Mar 18 14:33:51 crc kubenswrapper[4912]: I0318 14:33:51.740875 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g_d97da3f2-c9b0-42d1-a5ea-795994d0f5cb/util/0.log" Mar 18 14:33:51 crc kubenswrapper[4912]: I0318 14:33:51.798014 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g_d97da3f2-c9b0-42d1-a5ea-795994d0f5cb/extract/0.log" Mar 18 14:33:51 crc kubenswrapper[4912]: I0318 14:33:51.810792 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397267722g_d97da3f2-c9b0-42d1-a5ea-795994d0f5cb/pull/0.log" Mar 18 14:33:51 crc kubenswrapper[4912]: I0318 14:33:51.997432 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sfv6d_4375d78c-761e-4691-9da9-89f56373ea76/extract-utilities/0.log" Mar 18 14:33:52 crc kubenswrapper[4912]: I0318 14:33:52.239103 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:33:52 
crc kubenswrapper[4912]: E0318 14:33:52.239383 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:33:52 crc kubenswrapper[4912]: I0318 14:33:52.943905 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sfv6d_4375d78c-761e-4691-9da9-89f56373ea76/extract-utilities/0.log" Mar 18 14:33:52 crc kubenswrapper[4912]: I0318 14:33:52.951720 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sfv6d_4375d78c-761e-4691-9da9-89f56373ea76/extract-content/0.log" Mar 18 14:33:52 crc kubenswrapper[4912]: I0318 14:33:52.956531 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sfv6d_4375d78c-761e-4691-9da9-89f56373ea76/extract-content/0.log" Mar 18 14:33:53 crc kubenswrapper[4912]: I0318 14:33:53.258122 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sfv6d_4375d78c-761e-4691-9da9-89f56373ea76/extract-content/0.log" Mar 18 14:33:53 crc kubenswrapper[4912]: I0318 14:33:53.316800 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sfv6d_4375d78c-761e-4691-9da9-89f56373ea76/extract-utilities/0.log" Mar 18 14:33:53 crc kubenswrapper[4912]: I0318 14:33:53.399366 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6mtn_be01ffc1-29df-445f-b0e7-6dd0e80c6297/extract-utilities/0.log" Mar 18 14:33:53 crc kubenswrapper[4912]: I0318 14:33:53.617395 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-g6mtn_be01ffc1-29df-445f-b0e7-6dd0e80c6297/extract-utilities/0.log" Mar 18 14:33:53 crc kubenswrapper[4912]: I0318 14:33:53.649745 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6mtn_be01ffc1-29df-445f-b0e7-6dd0e80c6297/extract-content/0.log" Mar 18 14:33:53 crc kubenswrapper[4912]: I0318 14:33:53.721080 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6mtn_be01ffc1-29df-445f-b0e7-6dd0e80c6297/extract-content/0.log" Mar 18 14:33:53 crc kubenswrapper[4912]: I0318 14:33:53.979242 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6mtn_be01ffc1-29df-445f-b0e7-6dd0e80c6297/extract-utilities/0.log" Mar 18 14:33:54 crc kubenswrapper[4912]: I0318 14:33:54.126695 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6mtn_be01ffc1-29df-445f-b0e7-6dd0e80c6297/extract-content/0.log" Mar 18 14:33:54 crc kubenswrapper[4912]: I0318 14:33:54.312425 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nw2vt_70dff85c-f45b-431d-83ad-3b7802b15cd3/marketplace-operator/0.log" Mar 18 14:33:54 crc kubenswrapper[4912]: I0318 14:33:54.412849 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sfv6d_4375d78c-761e-4691-9da9-89f56373ea76/registry-server/0.log" Mar 18 14:33:54 crc kubenswrapper[4912]: I0318 14:33:54.488162 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/extract-utilities/0.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.230743 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-g6mtn_be01ffc1-29df-445f-b0e7-6dd0e80c6297/registry-server/0.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.416733 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/extract-utilities/0.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.432828 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/extract-content/0.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.465580 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/extract-content/0.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.704279 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/extract-content/0.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.792484 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/extract-utilities/0.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.860419 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/registry-server/1.log" Mar 18 14:33:55 crc kubenswrapper[4912]: I0318 14:33:55.873463 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-98dk7_73cfec7d-c7e6-4beb-9a85-f161c2c7c31a/registry-server/0.log" Mar 18 14:33:56 crc kubenswrapper[4912]: I0318 14:33:56.009200 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-pr4zx_b5944127-745d-42f9-83c2-d448435da4c9/extract-utilities/0.log" Mar 18 14:33:56 crc kubenswrapper[4912]: I0318 14:33:56.600441 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pr4zx_b5944127-745d-42f9-83c2-d448435da4c9/extract-utilities/0.log" Mar 18 14:33:56 crc kubenswrapper[4912]: I0318 14:33:56.632075 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pr4zx_b5944127-745d-42f9-83c2-d448435da4c9/extract-content/0.log" Mar 18 14:33:56 crc kubenswrapper[4912]: I0318 14:33:56.665881 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pr4zx_b5944127-745d-42f9-83c2-d448435da4c9/extract-content/0.log" Mar 18 14:33:56 crc kubenswrapper[4912]: I0318 14:33:56.826858 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pr4zx_b5944127-745d-42f9-83c2-d448435da4c9/extract-utilities/0.log" Mar 18 14:33:56 crc kubenswrapper[4912]: I0318 14:33:56.869190 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pr4zx_b5944127-745d-42f9-83c2-d448435da4c9/extract-content/0.log" Mar 18 14:33:57 crc kubenswrapper[4912]: I0318 14:33:57.689932 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pr4zx_b5944127-745d-42f9-83c2-d448435da4c9/registry-server/0.log" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.149023 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564074-97bwz"] Mar 18 14:34:00 crc kubenswrapper[4912]: E0318 14:34:00.150694 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d84325-2fba-463a-9c88-48eb40e0e43e" containerName="oc" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.150713 4912 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20d84325-2fba-463a-9c88-48eb40e0e43e" containerName="oc" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.151114 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d84325-2fba-463a-9c88-48eb40e0e43e" containerName="oc" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.153714 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-97bwz" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.155830 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.155968 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.156215 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.161538 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-97bwz"] Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.234941 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmx2r\" (UniqueName: \"kubernetes.io/projected/8f6268ac-bc25-4cf3-a635-9aa2412b10aa-kube-api-access-tmx2r\") pod \"auto-csr-approver-29564074-97bwz\" (UID: \"8f6268ac-bc25-4cf3-a635-9aa2412b10aa\") " pod="openshift-infra/auto-csr-approver-29564074-97bwz" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.338670 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmx2r\" (UniqueName: \"kubernetes.io/projected/8f6268ac-bc25-4cf3-a635-9aa2412b10aa-kube-api-access-tmx2r\") pod \"auto-csr-approver-29564074-97bwz\" (UID: \"8f6268ac-bc25-4cf3-a635-9aa2412b10aa\") " 
pod="openshift-infra/auto-csr-approver-29564074-97bwz" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.359275 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmx2r\" (UniqueName: \"kubernetes.io/projected/8f6268ac-bc25-4cf3-a635-9aa2412b10aa-kube-api-access-tmx2r\") pod \"auto-csr-approver-29564074-97bwz\" (UID: \"8f6268ac-bc25-4cf3-a635-9aa2412b10aa\") " pod="openshift-infra/auto-csr-approver-29564074-97bwz" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.481639 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-97bwz" Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.993336 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:34:00 crc kubenswrapper[4912]: I0318 14:34:00.999757 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-97bwz"] Mar 18 14:34:01 crc kubenswrapper[4912]: I0318 14:34:01.800073 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-97bwz" event={"ID":"8f6268ac-bc25-4cf3-a635-9aa2412b10aa","Type":"ContainerStarted","Data":"df237532febbac03ab7c0fe3d48ecb4225850a844f506c40205b48a520e832ab"} Mar 18 14:34:03 crc kubenswrapper[4912]: I0318 14:34:03.830846 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-97bwz" event={"ID":"8f6268ac-bc25-4cf3-a635-9aa2412b10aa","Type":"ContainerStarted","Data":"30e82b8ced5effa764d511bff534922a40bcd0d146ac7e59699a64e9e6020494"} Mar 18 14:34:03 crc kubenswrapper[4912]: I0318 14:34:03.861279 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564074-97bwz" podStartSLOduration=1.7109195590000001 podStartE2EDuration="3.861245963s" podCreationTimestamp="2026-03-18 14:34:00 +0000 UTC" firstStartedPulling="2026-03-18 
14:34:00.993109579 +0000 UTC m=+5489.452537004" lastFinishedPulling="2026-03-18 14:34:03.143435983 +0000 UTC m=+5491.602863408" observedRunningTime="2026-03-18 14:34:03.85338203 +0000 UTC m=+5492.312809475" watchObservedRunningTime="2026-03-18 14:34:03.861245963 +0000 UTC m=+5492.320673398" Mar 18 14:34:04 crc kubenswrapper[4912]: I0318 14:34:04.851467 4912 generic.go:334] "Generic (PLEG): container finished" podID="8f6268ac-bc25-4cf3-a635-9aa2412b10aa" containerID="30e82b8ced5effa764d511bff534922a40bcd0d146ac7e59699a64e9e6020494" exitCode=0 Mar 18 14:34:04 crc kubenswrapper[4912]: I0318 14:34:04.851555 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-97bwz" event={"ID":"8f6268ac-bc25-4cf3-a635-9aa2412b10aa","Type":"ContainerDied","Data":"30e82b8ced5effa764d511bff534922a40bcd0d146ac7e59699a64e9e6020494"} Mar 18 14:34:05 crc kubenswrapper[4912]: I0318 14:34:05.228631 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:34:05 crc kubenswrapper[4912]: E0318 14:34:05.229052 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.320761 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-97bwz" Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.414109 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmx2r\" (UniqueName: \"kubernetes.io/projected/8f6268ac-bc25-4cf3-a635-9aa2412b10aa-kube-api-access-tmx2r\") pod \"8f6268ac-bc25-4cf3-a635-9aa2412b10aa\" (UID: \"8f6268ac-bc25-4cf3-a635-9aa2412b10aa\") " Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.442119 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6268ac-bc25-4cf3-a635-9aa2412b10aa-kube-api-access-tmx2r" (OuterVolumeSpecName: "kube-api-access-tmx2r") pod "8f6268ac-bc25-4cf3-a635-9aa2412b10aa" (UID: "8f6268ac-bc25-4cf3-a635-9aa2412b10aa"). InnerVolumeSpecName "kube-api-access-tmx2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.518220 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmx2r\" (UniqueName: \"kubernetes.io/projected/8f6268ac-bc25-4cf3-a635-9aa2412b10aa-kube-api-access-tmx2r\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.884334 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-97bwz" event={"ID":"8f6268ac-bc25-4cf3-a635-9aa2412b10aa","Type":"ContainerDied","Data":"df237532febbac03ab7c0fe3d48ecb4225850a844f506c40205b48a520e832ab"} Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.884393 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df237532febbac03ab7c0fe3d48ecb4225850a844f506c40205b48a520e832ab" Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.884486 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-97bwz" Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.969465 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-2vlzq"] Mar 18 14:34:06 crc kubenswrapper[4912]: I0318 14:34:06.982946 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-2vlzq"] Mar 18 14:34:08 crc kubenswrapper[4912]: I0318 14:34:08.246204 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81eb1334-0249-47b0-a348-570af03963fd" path="/var/lib/kubelet/pods/81eb1334-0249-47b0-a348-570af03963fd/volumes" Mar 18 14:34:14 crc kubenswrapper[4912]: I0318 14:34:14.792905 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-2r5xf_37808f2f-08d5-432e-8ad6-69ad0b0e573a/prometheus-operator/0.log" Mar 18 14:34:14 crc kubenswrapper[4912]: I0318 14:34:14.836639 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68d7879b9-66zt6_0c4d6abc-d5f8-4ce0-bb8e-eff94cd9bdf2/prometheus-operator-admission-webhook/0.log" Mar 18 14:34:14 crc kubenswrapper[4912]: I0318 14:34:14.847293 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68d7879b9-flqsx_d4045f06-e567-4dda-8192-2dbef917a7a0/prometheus-operator-admission-webhook/0.log" Mar 18 14:34:15 crc kubenswrapper[4912]: I0318 14:34:15.038198 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-lcgrk_ffcc0a7f-efff-4a18-8002-7b33a557293c/operator/0.log" Mar 18 14:34:15 crc kubenswrapper[4912]: I0318 14:34:15.107488 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-7bb4554dcb-4hc2x_b1062176-da75-4c7d-a3fc-b5ecee790973/perses-operator/0.log" Mar 18 14:34:15 
crc kubenswrapper[4912]: I0318 14:34:15.130870 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-w8j42_be9dbd3b-a78d-4306-b834-3cd7c60d7d05/observability-ui-dashboards/0.log" Mar 18 14:34:17 crc kubenswrapper[4912]: I0318 14:34:17.051657 4912 scope.go:117] "RemoveContainer" containerID="2033d1c93650c3b613c32a247af0aefa05b6f191190136fc2b92906b83b2f9d9" Mar 18 14:34:19 crc kubenswrapper[4912]: I0318 14:34:19.229026 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:34:19 crc kubenswrapper[4912]: E0318 14:34:19.233114 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:34:34 crc kubenswrapper[4912]: I0318 14:34:34.228635 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:34:34 crc kubenswrapper[4912]: E0318 14:34:34.230092 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:34:35 crc kubenswrapper[4912]: I0318 14:34:35.420284 4912 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-867987c6b7-jg2ct_8efdcb68-92df-434c-8446-5be1ef0a94ba/manager/0.log" Mar 18 14:34:35 crc kubenswrapper[4912]: I0318 14:34:35.422916 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-867987c6b7-jg2ct_8efdcb68-92df-434c-8446-5be1ef0a94ba/manager/1.log" Mar 18 14:34:35 crc kubenswrapper[4912]: I0318 14:34:35.473190 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-867987c6b7-jg2ct_8efdcb68-92df-434c-8446-5be1ef0a94ba/kube-rbac-proxy/0.log" Mar 18 14:34:45 crc kubenswrapper[4912]: I0318 14:34:45.228660 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:34:45 crc kubenswrapper[4912]: E0318 14:34:45.229745 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:34:58 crc kubenswrapper[4912]: I0318 14:34:58.228410 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:34:58 crc kubenswrapper[4912]: E0318 14:34:58.229528 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:35:10 crc kubenswrapper[4912]: I0318 14:35:10.229685 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:35:10 crc kubenswrapper[4912]: E0318 14:35:10.230578 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:35:17 crc kubenswrapper[4912]: I0318 14:35:17.186421 4912 scope.go:117] "RemoveContainer" containerID="621ae32aaef30236fd469b758aba279af9bdd5cb4963a9cabc8231b9d9623150" Mar 18 14:35:23 crc kubenswrapper[4912]: I0318 14:35:23.230122 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:35:23 crc kubenswrapper[4912]: E0318 14:35:23.230890 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:35:34 crc kubenswrapper[4912]: I0318 14:35:34.229120 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:35:34 crc kubenswrapper[4912]: E0318 14:35:34.230398 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:35:47 crc kubenswrapper[4912]: I0318 14:35:47.228138 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:35:47 crc kubenswrapper[4912]: E0318 14:35:47.229151 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:35:58 crc kubenswrapper[4912]: I0318 14:35:58.229010 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:35:58 crc kubenswrapper[4912]: E0318 14:35:58.230195 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.161498 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564076-vcw5x"] Mar 18 14:36:00 crc kubenswrapper[4912]: E0318 14:36:00.163152 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6268ac-bc25-4cf3-a635-9aa2412b10aa" containerName="oc" Mar 18 14:36:00 crc 
kubenswrapper[4912]: I0318 14:36:00.163171 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6268ac-bc25-4cf3-a635-9aa2412b10aa" containerName="oc" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.163490 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6268ac-bc25-4cf3-a635-9aa2412b10aa" containerName="oc" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.164584 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.167800 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.168532 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.170728 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.178766 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-vcw5x"] Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.207290 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x79xj\" (UniqueName: \"kubernetes.io/projected/4c215a29-03d9-4b1c-9057-afae168546e1-kube-api-access-x79xj\") pod \"auto-csr-approver-29564076-vcw5x\" (UID: \"4c215a29-03d9-4b1c-9057-afae168546e1\") " pod="openshift-infra/auto-csr-approver-29564076-vcw5x" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.313865 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x79xj\" (UniqueName: \"kubernetes.io/projected/4c215a29-03d9-4b1c-9057-afae168546e1-kube-api-access-x79xj\") pod \"auto-csr-approver-29564076-vcw5x\" 
(UID: \"4c215a29-03d9-4b1c-9057-afae168546e1\") " pod="openshift-infra/auto-csr-approver-29564076-vcw5x" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.341534 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79xj\" (UniqueName: \"kubernetes.io/projected/4c215a29-03d9-4b1c-9057-afae168546e1-kube-api-access-x79xj\") pod \"auto-csr-approver-29564076-vcw5x\" (UID: \"4c215a29-03d9-4b1c-9057-afae168546e1\") " pod="openshift-infra/auto-csr-approver-29564076-vcw5x" Mar 18 14:36:00 crc kubenswrapper[4912]: I0318 14:36:00.495695 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" Mar 18 14:36:01 crc kubenswrapper[4912]: I0318 14:36:01.316818 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-vcw5x"] Mar 18 14:36:01 crc kubenswrapper[4912]: W0318 14:36:01.325938 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c215a29_03d9_4b1c_9057_afae168546e1.slice/crio-6443a960557e1366ec540a4ff8dee9bbaefde212b2cb7625d8b76cd5cfc19c91 WatchSource:0}: Error finding container 6443a960557e1366ec540a4ff8dee9bbaefde212b2cb7625d8b76cd5cfc19c91: Status 404 returned error can't find the container with id 6443a960557e1366ec540a4ff8dee9bbaefde212b2cb7625d8b76cd5cfc19c91 Mar 18 14:36:01 crc kubenswrapper[4912]: I0318 14:36:01.567405 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" event={"ID":"4c215a29-03d9-4b1c-9057-afae168546e1","Type":"ContainerStarted","Data":"6443a960557e1366ec540a4ff8dee9bbaefde212b2cb7625d8b76cd5cfc19c91"} Mar 18 14:36:03 crc kubenswrapper[4912]: I0318 14:36:03.590881 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" 
event={"ID":"4c215a29-03d9-4b1c-9057-afae168546e1","Type":"ContainerStarted","Data":"42f881605576ba27a2c6bf238be35b82e2e6e28a5bf940c80b1fdc666bf25f39"} Mar 18 14:36:03 crc kubenswrapper[4912]: I0318 14:36:03.618392 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" podStartSLOduration=2.252994435 podStartE2EDuration="3.618371954s" podCreationTimestamp="2026-03-18 14:36:00 +0000 UTC" firstStartedPulling="2026-03-18 14:36:01.328209443 +0000 UTC m=+5609.787636868" lastFinishedPulling="2026-03-18 14:36:02.693586962 +0000 UTC m=+5611.153014387" observedRunningTime="2026-03-18 14:36:03.611154548 +0000 UTC m=+5612.070581983" watchObservedRunningTime="2026-03-18 14:36:03.618371954 +0000 UTC m=+5612.077799379" Mar 18 14:36:04 crc kubenswrapper[4912]: I0318 14:36:04.614724 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" event={"ID":"4c215a29-03d9-4b1c-9057-afae168546e1","Type":"ContainerDied","Data":"42f881605576ba27a2c6bf238be35b82e2e6e28a5bf940c80b1fdc666bf25f39"} Mar 18 14:36:04 crc kubenswrapper[4912]: I0318 14:36:04.614659 4912 generic.go:334] "Generic (PLEG): container finished" podID="4c215a29-03d9-4b1c-9057-afae168546e1" containerID="42f881605576ba27a2c6bf238be35b82e2e6e28a5bf940c80b1fdc666bf25f39" exitCode=0 Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.123775 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.210967 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x79xj\" (UniqueName: \"kubernetes.io/projected/4c215a29-03d9-4b1c-9057-afae168546e1-kube-api-access-x79xj\") pod \"4c215a29-03d9-4b1c-9057-afae168546e1\" (UID: \"4c215a29-03d9-4b1c-9057-afae168546e1\") " Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.219898 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c215a29-03d9-4b1c-9057-afae168546e1-kube-api-access-x79xj" (OuterVolumeSpecName: "kube-api-access-x79xj") pod "4c215a29-03d9-4b1c-9057-afae168546e1" (UID: "4c215a29-03d9-4b1c-9057-afae168546e1"). InnerVolumeSpecName "kube-api-access-x79xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.316071 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x79xj\" (UniqueName: \"kubernetes.io/projected/4c215a29-03d9-4b1c-9057-afae168546e1-kube-api-access-x79xj\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.658648 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" event={"ID":"4c215a29-03d9-4b1c-9057-afae168546e1","Type":"ContainerDied","Data":"6443a960557e1366ec540a4ff8dee9bbaefde212b2cb7625d8b76cd5cfc19c91"} Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.658713 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6443a960557e1366ec540a4ff8dee9bbaefde212b2cb7625d8b76cd5cfc19c91" Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.658791 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-vcw5x" Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.781432 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-qvgzj"] Mar 18 14:36:06 crc kubenswrapper[4912]: I0318 14:36:06.810866 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-qvgzj"] Mar 18 14:36:08 crc kubenswrapper[4912]: I0318 14:36:08.246998 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c1e3c5-c136-42cb-a0f6-4067a05a67c4" path="/var/lib/kubelet/pods/49c1e3c5-c136-42cb-a0f6-4067a05a67c4/volumes" Mar 18 14:36:09 crc kubenswrapper[4912]: I0318 14:36:09.228124 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:36:09 crc kubenswrapper[4912]: E0318 14:36:09.229115 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.609533 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9qg"] Mar 18 14:36:16 crc kubenswrapper[4912]: E0318 14:36:16.611241 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c215a29-03d9-4b1c-9057-afae168546e1" containerName="oc" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.611263 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c215a29-03d9-4b1c-9057-afae168546e1" containerName="oc" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.611593 4912 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4c215a29-03d9-4b1c-9057-afae168546e1" containerName="oc" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.615482 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.651106 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-catalog-content\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.651307 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-utilities\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.651407 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdjhc\" (UniqueName: \"kubernetes.io/projected/21870927-2376-4e37-97b2-79cca93705b3-kube-api-access-qdjhc\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.655121 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9qg"] Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.754831 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-catalog-content\") pod \"redhat-marketplace-bl9qg\" (UID: 
\"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.755785 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-catalog-content\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.756844 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-utilities\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.757606 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-utilities\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.758137 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdjhc\" (UniqueName: \"kubernetes.io/projected/21870927-2376-4e37-97b2-79cca93705b3-kube-api-access-qdjhc\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.783774 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdjhc\" (UniqueName: \"kubernetes.io/projected/21870927-2376-4e37-97b2-79cca93705b3-kube-api-access-qdjhc\") pod \"redhat-marketplace-bl9qg\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " 
pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:16 crc kubenswrapper[4912]: I0318 14:36:16.938622 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:17 crc kubenswrapper[4912]: I0318 14:36:17.305367 4912 scope.go:117] "RemoveContainer" containerID="4ca705a1debba8c103772ef1f6f76ca88a83286964454a2e5d9d0911bf21cf57" Mar 18 14:36:17 crc kubenswrapper[4912]: I0318 14:36:17.362204 4912 scope.go:117] "RemoveContainer" containerID="11ee896c8d4d04e24db8c51e305d28fdea4e637a223d6df842532d86fa322ab7" Mar 18 14:36:17 crc kubenswrapper[4912]: I0318 14:36:17.521953 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9qg"] Mar 18 14:36:17 crc kubenswrapper[4912]: W0318 14:36:17.555674 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21870927_2376_4e37_97b2_79cca93705b3.slice/crio-1340c85c7f717f3d89a169d215da7a7c3442ee4eef62826f1bd5786be08c96e6 WatchSource:0}: Error finding container 1340c85c7f717f3d89a169d215da7a7c3442ee4eef62826f1bd5786be08c96e6: Status 404 returned error can't find the container with id 1340c85c7f717f3d89a169d215da7a7c3442ee4eef62826f1bd5786be08c96e6 Mar 18 14:36:17 crc kubenswrapper[4912]: I0318 14:36:17.823277 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerStarted","Data":"aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80"} Mar 18 14:36:17 crc kubenswrapper[4912]: I0318 14:36:17.823666 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerStarted","Data":"1340c85c7f717f3d89a169d215da7a7c3442ee4eef62826f1bd5786be08c96e6"} Mar 18 14:36:18 crc 
kubenswrapper[4912]: I0318 14:36:18.844529 4912 generic.go:334] "Generic (PLEG): container finished" podID="21870927-2376-4e37-97b2-79cca93705b3" containerID="aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80" exitCode=0 Mar 18 14:36:18 crc kubenswrapper[4912]: I0318 14:36:18.844698 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerDied","Data":"aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80"} Mar 18 14:36:19 crc kubenswrapper[4912]: I0318 14:36:19.861461 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerStarted","Data":"02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a"} Mar 18 14:36:21 crc kubenswrapper[4912]: I0318 14:36:21.914392 4912 generic.go:334] "Generic (PLEG): container finished" podID="21870927-2376-4e37-97b2-79cca93705b3" containerID="02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a" exitCode=0 Mar 18 14:36:21 crc kubenswrapper[4912]: I0318 14:36:21.915196 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerDied","Data":"02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a"} Mar 18 14:36:22 crc kubenswrapper[4912]: I0318 14:36:22.241327 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:36:22 crc kubenswrapper[4912]: E0318 14:36:22.241776 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:36:22 crc kubenswrapper[4912]: I0318 14:36:22.929708 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerStarted","Data":"d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c"} Mar 18 14:36:22 crc kubenswrapper[4912]: I0318 14:36:22.969961 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bl9qg" podStartSLOduration=3.411699184 podStartE2EDuration="6.969938107s" podCreationTimestamp="2026-03-18 14:36:16 +0000 UTC" firstStartedPulling="2026-03-18 14:36:18.848874379 +0000 UTC m=+5627.308301804" lastFinishedPulling="2026-03-18 14:36:22.407113302 +0000 UTC m=+5630.866540727" observedRunningTime="2026-03-18 14:36:22.968232381 +0000 UTC m=+5631.427659806" watchObservedRunningTime="2026-03-18 14:36:22.969938107 +0000 UTC m=+5631.429365532" Mar 18 14:36:26 crc kubenswrapper[4912]: I0318 14:36:26.940508 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:26 crc kubenswrapper[4912]: I0318 14:36:26.941410 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:27 crc kubenswrapper[4912]: I0318 14:36:27.147443 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:27 crc kubenswrapper[4912]: I0318 14:36:27.224138 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:27 crc kubenswrapper[4912]: I0318 14:36:27.406917 
4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9qg"] Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.049356 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bl9qg" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="registry-server" containerID="cri-o://d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c" gracePeriod=2 Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.636382 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.759487 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-catalog-content\") pod \"21870927-2376-4e37-97b2-79cca93705b3\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.759646 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-utilities\") pod \"21870927-2376-4e37-97b2-79cca93705b3\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.759759 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdjhc\" (UniqueName: \"kubernetes.io/projected/21870927-2376-4e37-97b2-79cca93705b3-kube-api-access-qdjhc\") pod \"21870927-2376-4e37-97b2-79cca93705b3\" (UID: \"21870927-2376-4e37-97b2-79cca93705b3\") " Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.761173 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-utilities" (OuterVolumeSpecName: "utilities") pod 
"21870927-2376-4e37-97b2-79cca93705b3" (UID: "21870927-2376-4e37-97b2-79cca93705b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.762389 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.778412 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21870927-2376-4e37-97b2-79cca93705b3-kube-api-access-qdjhc" (OuterVolumeSpecName: "kube-api-access-qdjhc") pod "21870927-2376-4e37-97b2-79cca93705b3" (UID: "21870927-2376-4e37-97b2-79cca93705b3"). InnerVolumeSpecName "kube-api-access-qdjhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.799805 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21870927-2376-4e37-97b2-79cca93705b3" (UID: "21870927-2376-4e37-97b2-79cca93705b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.865382 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21870927-2376-4e37-97b2-79cca93705b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:29 crc kubenswrapper[4912]: I0318 14:36:29.865418 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdjhc\" (UniqueName: \"kubernetes.io/projected/21870927-2376-4e37-97b2-79cca93705b3-kube-api-access-qdjhc\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.068161 4912 generic.go:334] "Generic (PLEG): container finished" podID="21870927-2376-4e37-97b2-79cca93705b3" containerID="d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c" exitCode=0 Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.068232 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerDied","Data":"d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c"} Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.068320 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9qg" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.068347 4912 scope.go:117] "RemoveContainer" containerID="d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.068326 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9qg" event={"ID":"21870927-2376-4e37-97b2-79cca93705b3","Type":"ContainerDied","Data":"1340c85c7f717f3d89a169d215da7a7c3442ee4eef62826f1bd5786be08c96e6"} Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.116161 4912 scope.go:117] "RemoveContainer" containerID="02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.122296 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9qg"] Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.134814 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9qg"] Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.178130 4912 scope.go:117] "RemoveContainer" containerID="aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.237717 4912 scope.go:117] "RemoveContainer" containerID="d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c" Mar 18 14:36:30 crc kubenswrapper[4912]: E0318 14:36:30.242549 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c\": container with ID starting with d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c not found: ID does not exist" containerID="d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.242611 4912 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c"} err="failed to get container status \"d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c\": rpc error: code = NotFound desc = could not find container \"d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c\": container with ID starting with d354feb2630f54b90475193e7fe18660538da676cd850133e91c1ff4cba5356c not found: ID does not exist" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.242642 4912 scope.go:117] "RemoveContainer" containerID="02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a" Mar 18 14:36:30 crc kubenswrapper[4912]: E0318 14:36:30.249943 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a\": container with ID starting with 02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a not found: ID does not exist" containerID="02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.250009 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a"} err="failed to get container status \"02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a\": rpc error: code = NotFound desc = could not find container \"02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a\": container with ID starting with 02994525eeb94c3c3f096c4dac0b50788b5399439b184155d19be8a64431343a not found: ID does not exist" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.250066 4912 scope.go:117] "RemoveContainer" containerID="aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80" Mar 18 14:36:30 crc kubenswrapper[4912]: E0318 
14:36:30.250919 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80\": container with ID starting with aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80 not found: ID does not exist" containerID="aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.250953 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80"} err="failed to get container status \"aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80\": rpc error: code = NotFound desc = could not find container \"aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80\": container with ID starting with aefbdc6f58fb2d7460c3cee5b4a96a34289fc6532bfd4b99138e8a034e839d80 not found: ID does not exist" Mar 18 14:36:30 crc kubenswrapper[4912]: I0318 14:36:30.308403 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21870927-2376-4e37-97b2-79cca93705b3" path="/var/lib/kubelet/pods/21870927-2376-4e37-97b2-79cca93705b3/volumes" Mar 18 14:36:36 crc kubenswrapper[4912]: I0318 14:36:36.228176 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:36:36 crc kubenswrapper[4912]: E0318 14:36:36.229101 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:36:41 crc kubenswrapper[4912]: I0318 14:36:41.985216 
4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ql4v6"] Mar 18 14:36:41 crc kubenswrapper[4912]: E0318 14:36:41.986924 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="extract-content" Mar 18 14:36:41 crc kubenswrapper[4912]: I0318 14:36:41.986944 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="extract-content" Mar 18 14:36:41 crc kubenswrapper[4912]: E0318 14:36:41.986962 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="registry-server" Mar 18 14:36:41 crc kubenswrapper[4912]: I0318 14:36:41.986969 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="registry-server" Mar 18 14:36:41 crc kubenswrapper[4912]: E0318 14:36:41.986981 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="extract-utilities" Mar 18 14:36:41 crc kubenswrapper[4912]: I0318 14:36:41.986987 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="extract-utilities" Mar 18 14:36:41 crc kubenswrapper[4912]: I0318 14:36:41.987328 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="21870927-2376-4e37-97b2-79cca93705b3" containerName="registry-server" Mar 18 14:36:41 crc kubenswrapper[4912]: I0318 14:36:41.990453 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.016628 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ql4v6"] Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.083693 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tjn\" (UniqueName: \"kubernetes.io/projected/695f9961-1df1-4373-ac1a-05efed2736a0-kube-api-access-k4tjn\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.083975 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-catalog-content\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.084304 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-utilities\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.186599 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tjn\" (UniqueName: \"kubernetes.io/projected/695f9961-1df1-4373-ac1a-05efed2736a0-kube-api-access-k4tjn\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.187189 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-catalog-content\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.187559 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-utilities\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.187954 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-catalog-content\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.188510 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-utilities\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.215019 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tjn\" (UniqueName: \"kubernetes.io/projected/695f9961-1df1-4373-ac1a-05efed2736a0-kube-api-access-k4tjn\") pod \"certified-operators-ql4v6\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.348289 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:42 crc kubenswrapper[4912]: I0318 14:36:42.938369 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ql4v6"] Mar 18 14:36:43 crc kubenswrapper[4912]: I0318 14:36:43.324649 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ql4v6" event={"ID":"695f9961-1df1-4373-ac1a-05efed2736a0","Type":"ContainerStarted","Data":"3ae15031eb9fb7b8ad02e873cb3c218699775f87c6050e25e0d824e98efc5e3c"} Mar 18 14:36:44 crc kubenswrapper[4912]: I0318 14:36:44.340380 4912 generic.go:334] "Generic (PLEG): container finished" podID="695f9961-1df1-4373-ac1a-05efed2736a0" containerID="2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2" exitCode=0 Mar 18 14:36:44 crc kubenswrapper[4912]: I0318 14:36:44.340927 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ql4v6" event={"ID":"695f9961-1df1-4373-ac1a-05efed2736a0","Type":"ContainerDied","Data":"2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2"} Mar 18 14:36:46 crc kubenswrapper[4912]: I0318 14:36:46.384825 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ql4v6" event={"ID":"695f9961-1df1-4373-ac1a-05efed2736a0","Type":"ContainerStarted","Data":"f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776"} Mar 18 14:36:48 crc kubenswrapper[4912]: I0318 14:36:48.416668 4912 generic.go:334] "Generic (PLEG): container finished" podID="695f9961-1df1-4373-ac1a-05efed2736a0" containerID="f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776" exitCode=0 Mar 18 14:36:48 crc kubenswrapper[4912]: I0318 14:36:48.416733 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ql4v6" 
event={"ID":"695f9961-1df1-4373-ac1a-05efed2736a0","Type":"ContainerDied","Data":"f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776"} Mar 18 14:36:49 crc kubenswrapper[4912]: I0318 14:36:49.231326 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:36:49 crc kubenswrapper[4912]: E0318 14:36:49.232905 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:36:49 crc kubenswrapper[4912]: I0318 14:36:49.433867 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ql4v6" event={"ID":"695f9961-1df1-4373-ac1a-05efed2736a0","Type":"ContainerStarted","Data":"181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1"} Mar 18 14:36:49 crc kubenswrapper[4912]: I0318 14:36:49.463852 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ql4v6" podStartSLOduration=3.967905103 podStartE2EDuration="8.463830888s" podCreationTimestamp="2026-03-18 14:36:41 +0000 UTC" firstStartedPulling="2026-03-18 14:36:44.343706137 +0000 UTC m=+5652.803133562" lastFinishedPulling="2026-03-18 14:36:48.839631922 +0000 UTC m=+5657.299059347" observedRunningTime="2026-03-18 14:36:49.454917717 +0000 UTC m=+5657.914345162" watchObservedRunningTime="2026-03-18 14:36:49.463830888 +0000 UTC m=+5657.923258313" Mar 18 14:36:52 crc kubenswrapper[4912]: I0318 14:36:52.349439 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:52 crc 
kubenswrapper[4912]: I0318 14:36:52.350197 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:36:53 crc kubenswrapper[4912]: I0318 14:36:53.406391 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ql4v6" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="registry-server" probeResult="failure" output=< Mar 18 14:36:53 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:36:53 crc kubenswrapper[4912]: > Mar 18 14:37:02 crc kubenswrapper[4912]: I0318 14:37:02.238209 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:37:02 crc kubenswrapper[4912]: E0318 14:37:02.240783 4912 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vsp6g_openshift-machine-config-operator(c0c45cd5-793c-419f-8fe6-a2239050972e)\"" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" Mar 18 14:37:02 crc kubenswrapper[4912]: I0318 14:37:02.430123 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:37:02 crc kubenswrapper[4912]: I0318 14:37:02.518880 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:37:02 crc kubenswrapper[4912]: I0318 14:37:02.694098 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ql4v6"] Mar 18 14:37:03 crc kubenswrapper[4912]: I0318 14:37:03.641294 4912 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-ql4v6" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="registry-server" containerID="cri-o://181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1" gracePeriod=2 Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.256030 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.396767 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-utilities\") pod \"695f9961-1df1-4373-ac1a-05efed2736a0\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.396862 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4tjn\" (UniqueName: \"kubernetes.io/projected/695f9961-1df1-4373-ac1a-05efed2736a0-kube-api-access-k4tjn\") pod \"695f9961-1df1-4373-ac1a-05efed2736a0\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.397030 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-catalog-content\") pod \"695f9961-1df1-4373-ac1a-05efed2736a0\" (UID: \"695f9961-1df1-4373-ac1a-05efed2736a0\") " Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.397870 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-utilities" (OuterVolumeSpecName: "utilities") pod "695f9961-1df1-4373-ac1a-05efed2736a0" (UID: "695f9961-1df1-4373-ac1a-05efed2736a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.398366 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.404700 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/695f9961-1df1-4373-ac1a-05efed2736a0-kube-api-access-k4tjn" (OuterVolumeSpecName: "kube-api-access-k4tjn") pod "695f9961-1df1-4373-ac1a-05efed2736a0" (UID: "695f9961-1df1-4373-ac1a-05efed2736a0"). InnerVolumeSpecName "kube-api-access-k4tjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.457671 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "695f9961-1df1-4373-ac1a-05efed2736a0" (UID: "695f9961-1df1-4373-ac1a-05efed2736a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.501821 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4tjn\" (UniqueName: \"kubernetes.io/projected/695f9961-1df1-4373-ac1a-05efed2736a0-kube-api-access-k4tjn\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.501871 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695f9961-1df1-4373-ac1a-05efed2736a0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.664980 4912 generic.go:334] "Generic (PLEG): container finished" podID="695f9961-1df1-4373-ac1a-05efed2736a0" containerID="181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1" exitCode=0 Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.665099 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ql4v6" event={"ID":"695f9961-1df1-4373-ac1a-05efed2736a0","Type":"ContainerDied","Data":"181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1"} Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.665148 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ql4v6" event={"ID":"695f9961-1df1-4373-ac1a-05efed2736a0","Type":"ContainerDied","Data":"3ae15031eb9fb7b8ad02e873cb3c218699775f87c6050e25e0d824e98efc5e3c"} Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.665179 4912 scope.go:117] "RemoveContainer" containerID="181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.665183 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ql4v6" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.720120 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ql4v6"] Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.720224 4912 scope.go:117] "RemoveContainer" containerID="f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.736615 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ql4v6"] Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.781751 4912 scope.go:117] "RemoveContainer" containerID="2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.823836 4912 scope.go:117] "RemoveContainer" containerID="181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1" Mar 18 14:37:04 crc kubenswrapper[4912]: E0318 14:37:04.824864 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1\": container with ID starting with 181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1 not found: ID does not exist" containerID="181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.824936 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1"} err="failed to get container status \"181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1\": rpc error: code = NotFound desc = could not find container \"181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1\": container with ID starting with 181684c01c7921d64e82d5f5d613ee65097b65bcb88fc396d064376dfbb5b7c1 not 
found: ID does not exist" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.824988 4912 scope.go:117] "RemoveContainer" containerID="f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776" Mar 18 14:37:04 crc kubenswrapper[4912]: E0318 14:37:04.825830 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776\": container with ID starting with f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776 not found: ID does not exist" containerID="f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.825905 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776"} err="failed to get container status \"f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776\": rpc error: code = NotFound desc = could not find container \"f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776\": container with ID starting with f21b81f06c491102168db7a69f5700095bb24543d27b69c9118470b3237ed776 not found: ID does not exist" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.825951 4912 scope.go:117] "RemoveContainer" containerID="2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2" Mar 18 14:37:04 crc kubenswrapper[4912]: E0318 14:37:04.826610 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2\": container with ID starting with 2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2 not found: ID does not exist" containerID="2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2" Mar 18 14:37:04 crc kubenswrapper[4912]: I0318 14:37:04.826680 4912 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2"} err="failed to get container status \"2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2\": rpc error: code = NotFound desc = could not find container \"2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2\": container with ID starting with 2e8d85796ea2c3afa2f19ebef42989226428d1b9db7107c709fb740c19b3bbd2 not found: ID does not exist" Mar 18 14:37:06 crc kubenswrapper[4912]: I0318 14:37:06.249279 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" path="/var/lib/kubelet/pods/695f9961-1df1-4373-ac1a-05efed2736a0/volumes" Mar 18 14:37:09 crc kubenswrapper[4912]: I0318 14:37:09.758703 4912 generic.go:334] "Generic (PLEG): container finished" podID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerID="b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5" exitCode=0 Mar 18 14:37:09 crc kubenswrapper[4912]: I0318 14:37:09.758893 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" event={"ID":"973a8efa-1884-42ca-92ff-901de8a4fb85","Type":"ContainerDied","Data":"b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5"} Mar 18 14:37:09 crc kubenswrapper[4912]: I0318 14:37:09.760731 4912 scope.go:117] "RemoveContainer" containerID="b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5" Mar 18 14:37:10 crc kubenswrapper[4912]: I0318 14:37:10.067363 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nnfzh_must-gather-7mgpg_973a8efa-1884-42ca-92ff-901de8a4fb85/gather/0.log" Mar 18 14:37:17 crc kubenswrapper[4912]: I0318 14:37:17.228823 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:37:17 crc kubenswrapper[4912]: I0318 14:37:17.877370 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"e67315cf5ba5038c60598c4e6269b22af75c078721ab59e1bf37b409fd8cfe64"} Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.099390 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfzh/must-gather-7mgpg"] Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.100381 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerName="copy" containerID="cri-o://033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf" gracePeriod=2 Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.119435 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfzh/must-gather-7mgpg"] Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.667564 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nnfzh_must-gather-7mgpg_973a8efa-1884-42ca-92ff-901de8a4fb85/copy/0.log" Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.668742 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.735544 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnpjh\" (UniqueName: \"kubernetes.io/projected/973a8efa-1884-42ca-92ff-901de8a4fb85-kube-api-access-pnpjh\") pod \"973a8efa-1884-42ca-92ff-901de8a4fb85\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.735611 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/973a8efa-1884-42ca-92ff-901de8a4fb85-must-gather-output\") pod \"973a8efa-1884-42ca-92ff-901de8a4fb85\" (UID: \"973a8efa-1884-42ca-92ff-901de8a4fb85\") " Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.750443 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973a8efa-1884-42ca-92ff-901de8a4fb85-kube-api-access-pnpjh" (OuterVolumeSpecName: "kube-api-access-pnpjh") pod "973a8efa-1884-42ca-92ff-901de8a4fb85" (UID: "973a8efa-1884-42ca-92ff-901de8a4fb85"). InnerVolumeSpecName "kube-api-access-pnpjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.840464 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnpjh\" (UniqueName: \"kubernetes.io/projected/973a8efa-1884-42ca-92ff-901de8a4fb85-kube-api-access-pnpjh\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.944412 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973a8efa-1884-42ca-92ff-901de8a4fb85-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "973a8efa-1884-42ca-92ff-901de8a4fb85" (UID: "973a8efa-1884-42ca-92ff-901de8a4fb85"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.960004 4912 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nnfzh_must-gather-7mgpg_973a8efa-1884-42ca-92ff-901de8a4fb85/copy/0.log" Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.960495 4912 generic.go:334] "Generic (PLEG): container finished" podID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerID="033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf" exitCode=143 Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.960570 4912 scope.go:117] "RemoveContainer" containerID="033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf" Mar 18 14:37:21 crc kubenswrapper[4912]: I0318 14:37:21.960597 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfzh/must-gather-7mgpg" Mar 18 14:37:22 crc kubenswrapper[4912]: I0318 14:37:22.000853 4912 scope.go:117] "RemoveContainer" containerID="b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5" Mar 18 14:37:22 crc kubenswrapper[4912]: I0318 14:37:22.046482 4912 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/973a8efa-1884-42ca-92ff-901de8a4fb85-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:22 crc kubenswrapper[4912]: I0318 14:37:22.058421 4912 scope.go:117] "RemoveContainer" containerID="033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf" Mar 18 14:37:22 crc kubenswrapper[4912]: E0318 14:37:22.059981 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf\": container with ID starting with 033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf not found: ID does not exist" 
containerID="033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf" Mar 18 14:37:22 crc kubenswrapper[4912]: I0318 14:37:22.060029 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf"} err="failed to get container status \"033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf\": rpc error: code = NotFound desc = could not find container \"033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf\": container with ID starting with 033bf5ea5b60675a17ceed9220c0b31fe9e8756e002f6fa10307ca65fdccafdf not found: ID does not exist" Mar 18 14:37:22 crc kubenswrapper[4912]: I0318 14:37:22.060075 4912 scope.go:117] "RemoveContainer" containerID="b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5" Mar 18 14:37:22 crc kubenswrapper[4912]: E0318 14:37:22.060641 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5\": container with ID starting with b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5 not found: ID does not exist" containerID="b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5" Mar 18 14:37:22 crc kubenswrapper[4912]: I0318 14:37:22.060688 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5"} err="failed to get container status \"b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5\": rpc error: code = NotFound desc = could not find container \"b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5\": container with ID starting with b17e744b67c2bd68be3229e43662852a8a1e134c31353572f9d848ff449d99c5 not found: ID does not exist" Mar 18 14:37:22 crc kubenswrapper[4912]: I0318 14:37:22.244652 4912 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" path="/var/lib/kubelet/pods/973a8efa-1884-42ca-92ff-901de8a4fb85/volumes" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.594210 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sqhss"] Mar 18 14:37:30 crc kubenswrapper[4912]: E0318 14:37:30.595830 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="registry-server" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.595855 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="registry-server" Mar 18 14:37:30 crc kubenswrapper[4912]: E0318 14:37:30.595877 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="extract-utilities" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.595887 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="extract-utilities" Mar 18 14:37:30 crc kubenswrapper[4912]: E0318 14:37:30.595921 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerName="gather" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.595930 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerName="gather" Mar 18 14:37:30 crc kubenswrapper[4912]: E0318 14:37:30.595964 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerName="copy" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.595977 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerName="copy" Mar 18 14:37:30 crc kubenswrapper[4912]: E0318 14:37:30.595999 4912 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="extract-content" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.596007 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="extract-content" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.596348 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="695f9961-1df1-4373-ac1a-05efed2736a0" containerName="registry-server" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.596388 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerName="copy" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.596428 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="973a8efa-1884-42ca-92ff-901de8a4fb85" containerName="gather" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.599440 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.610911 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqhss"] Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.740790 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcq8m\" (UniqueName: \"kubernetes.io/projected/3e500b14-effa-4301-88bf-9a1667024e99-kube-api-access-lcq8m\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.741077 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-catalog-content\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.741562 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-utilities\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.845100 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcq8m\" (UniqueName: \"kubernetes.io/projected/3e500b14-effa-4301-88bf-9a1667024e99-kube-api-access-lcq8m\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.845821 4912 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-catalog-content\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.845979 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-utilities\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.846876 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-utilities\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.847167 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-catalog-content\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.875536 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcq8m\" (UniqueName: \"kubernetes.io/projected/3e500b14-effa-4301-88bf-9a1667024e99-kube-api-access-lcq8m\") pod \"community-operators-sqhss\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:30 crc kubenswrapper[4912]: I0318 14:37:30.941247 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:31 crc kubenswrapper[4912]: I0318 14:37:31.522919 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqhss"] Mar 18 14:37:31 crc kubenswrapper[4912]: I0318 14:37:31.907426 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqhss" event={"ID":"3e500b14-effa-4301-88bf-9a1667024e99","Type":"ContainerStarted","Data":"fb5420a19f85a11a2dcda45e226e29f1f2a6387f2d99bea4129de2756c69be2b"} Mar 18 14:37:32 crc kubenswrapper[4912]: I0318 14:37:32.921272 4912 generic.go:334] "Generic (PLEG): container finished" podID="3e500b14-effa-4301-88bf-9a1667024e99" containerID="656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573" exitCode=0 Mar 18 14:37:32 crc kubenswrapper[4912]: I0318 14:37:32.921389 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqhss" event={"ID":"3e500b14-effa-4301-88bf-9a1667024e99","Type":"ContainerDied","Data":"656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573"} Mar 18 14:37:33 crc kubenswrapper[4912]: I0318 14:37:33.948523 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqhss" event={"ID":"3e500b14-effa-4301-88bf-9a1667024e99","Type":"ContainerStarted","Data":"90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5"} Mar 18 14:37:37 crc kubenswrapper[4912]: I0318 14:37:37.008749 4912 generic.go:334] "Generic (PLEG): container finished" podID="3e500b14-effa-4301-88bf-9a1667024e99" containerID="90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5" exitCode=0 Mar 18 14:37:37 crc kubenswrapper[4912]: I0318 14:37:37.008946 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqhss" 
event={"ID":"3e500b14-effa-4301-88bf-9a1667024e99","Type":"ContainerDied","Data":"90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5"} Mar 18 14:37:38 crc kubenswrapper[4912]: I0318 14:37:38.031398 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqhss" event={"ID":"3e500b14-effa-4301-88bf-9a1667024e99","Type":"ContainerStarted","Data":"38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753"} Mar 18 14:37:38 crc kubenswrapper[4912]: I0318 14:37:38.079457 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sqhss" podStartSLOduration=3.581400164 podStartE2EDuration="8.079426815s" podCreationTimestamp="2026-03-18 14:37:30 +0000 UTC" firstStartedPulling="2026-03-18 14:37:32.925329755 +0000 UTC m=+5701.384757170" lastFinishedPulling="2026-03-18 14:37:37.423356386 +0000 UTC m=+5705.882783821" observedRunningTime="2026-03-18 14:37:38.065609911 +0000 UTC m=+5706.525037356" watchObservedRunningTime="2026-03-18 14:37:38.079426815 +0000 UTC m=+5706.538854240" Mar 18 14:37:40 crc kubenswrapper[4912]: I0318 14:37:40.941918 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:40 crc kubenswrapper[4912]: I0318 14:37:40.943189 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:37:42 crc kubenswrapper[4912]: I0318 14:37:42.006612 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sqhss" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="registry-server" probeResult="failure" output=< Mar 18 14:37:42 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:37:42 crc kubenswrapper[4912]: > Mar 18 14:37:52 crc kubenswrapper[4912]: I0318 14:37:52.001802 4912 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/community-operators-sqhss" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="registry-server" probeResult="failure" output=< Mar 18 14:37:52 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:37:52 crc kubenswrapper[4912]: > Mar 18 14:37:58 crc kubenswrapper[4912]: I0318 14:37:58.879134 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wh9pm"] Mar 18 14:37:58 crc kubenswrapper[4912]: I0318 14:37:58.884325 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:58 crc kubenswrapper[4912]: I0318 14:37:58.899073 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wh9pm"] Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.017051 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-utilities\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.017135 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-catalog-content\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.017410 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bvd\" (UniqueName: \"kubernetes.io/projected/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-kube-api-access-w5bvd\") pod \"redhat-operators-wh9pm\" (UID: 
\"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.120449 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bvd\" (UniqueName: \"kubernetes.io/projected/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-kube-api-access-w5bvd\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.120652 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-utilities\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.120761 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-catalog-content\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.121530 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-catalog-content\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.121658 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-utilities\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " 
pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.157882 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5bvd\" (UniqueName: \"kubernetes.io/projected/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-kube-api-access-w5bvd\") pod \"redhat-operators-wh9pm\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.245629 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:37:59 crc kubenswrapper[4912]: I0318 14:37:59.851942 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wh9pm"] Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.169683 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564078-krsxl"] Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.173871 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-krsxl" Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.178974 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.179760 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.185479 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.188523 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-krsxl"] Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.263744 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djw7g\" (UniqueName: \"kubernetes.io/projected/aaf6da5c-03a6-4ca1-8202-a1acd91106d8-kube-api-access-djw7g\") pod \"auto-csr-approver-29564078-krsxl\" (UID: \"aaf6da5c-03a6-4ca1-8202-a1acd91106d8\") " pod="openshift-infra/auto-csr-approver-29564078-krsxl" Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.367362 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djw7g\" (UniqueName: \"kubernetes.io/projected/aaf6da5c-03a6-4ca1-8202-a1acd91106d8-kube-api-access-djw7g\") pod \"auto-csr-approver-29564078-krsxl\" (UID: \"aaf6da5c-03a6-4ca1-8202-a1acd91106d8\") " pod="openshift-infra/auto-csr-approver-29564078-krsxl" Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.402232 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djw7g\" (UniqueName: \"kubernetes.io/projected/aaf6da5c-03a6-4ca1-8202-a1acd91106d8-kube-api-access-djw7g\") pod \"auto-csr-approver-29564078-krsxl\" (UID: \"aaf6da5c-03a6-4ca1-8202-a1acd91106d8\") " 
pod="openshift-infra/auto-csr-approver-29564078-krsxl" Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.402850 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9pm" event={"ID":"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1","Type":"ContainerStarted","Data":"841cbd55995636279fa8dced4e99cd59831478d7a6bab21ee24533f6ff83e555"} Mar 18 14:38:00 crc kubenswrapper[4912]: I0318 14:38:00.522117 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-krsxl" Mar 18 14:38:01 crc kubenswrapper[4912]: I0318 14:38:01.006594 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:38:01 crc kubenswrapper[4912]: W0318 14:38:01.059314 4912 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaf6da5c_03a6_4ca1_8202_a1acd91106d8.slice/crio-ff4158872134849d1bb3f234990697c821730a425af10922830eefa5e552e09d WatchSource:0}: Error finding container ff4158872134849d1bb3f234990697c821730a425af10922830eefa5e552e09d: Status 404 returned error can't find the container with id ff4158872134849d1bb3f234990697c821730a425af10922830eefa5e552e09d Mar 18 14:38:01 crc kubenswrapper[4912]: I0318 14:38:01.061301 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-krsxl"] Mar 18 14:38:01 crc kubenswrapper[4912]: I0318 14:38:01.087390 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:38:01 crc kubenswrapper[4912]: I0318 14:38:01.421204 4912 generic.go:334] "Generic (PLEG): container finished" podID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerID="2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0" exitCode=0 Mar 18 14:38:01 crc kubenswrapper[4912]: I0318 14:38:01.421303 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9pm" event={"ID":"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1","Type":"ContainerDied","Data":"2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0"} Mar 18 14:38:01 crc kubenswrapper[4912]: I0318 14:38:01.426124 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-krsxl" event={"ID":"aaf6da5c-03a6-4ca1-8202-a1acd91106d8","Type":"ContainerStarted","Data":"ff4158872134849d1bb3f234990697c821730a425af10922830eefa5e552e09d"} Mar 18 14:38:02 crc kubenswrapper[4912]: I0318 14:38:02.833808 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqhss"] Mar 18 14:38:02 crc kubenswrapper[4912]: I0318 14:38:02.838665 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sqhss" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="registry-server" containerID="cri-o://38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753" gracePeriod=2 Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.433864 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.465993 4912 generic.go:334] "Generic (PLEG): container finished" podID="3e500b14-effa-4301-88bf-9a1667024e99" containerID="38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753" exitCode=0 Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.466075 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqhss" event={"ID":"3e500b14-effa-4301-88bf-9a1667024e99","Type":"ContainerDied","Data":"38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753"} Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.466110 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqhss" event={"ID":"3e500b14-effa-4301-88bf-9a1667024e99","Type":"ContainerDied","Data":"fb5420a19f85a11a2dcda45e226e29f1f2a6387f2d99bea4129de2756c69be2b"} Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.466131 4912 scope.go:117] "RemoveContainer" containerID="38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.466285 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqhss" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.470677 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9pm" event={"ID":"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1","Type":"ContainerStarted","Data":"05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639"} Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.474993 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-catalog-content\") pod \"3e500b14-effa-4301-88bf-9a1667024e99\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.475107 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcq8m\" (UniqueName: \"kubernetes.io/projected/3e500b14-effa-4301-88bf-9a1667024e99-kube-api-access-lcq8m\") pod \"3e500b14-effa-4301-88bf-9a1667024e99\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.475193 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-utilities\") pod \"3e500b14-effa-4301-88bf-9a1667024e99\" (UID: \"3e500b14-effa-4301-88bf-9a1667024e99\") " Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.476722 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-utilities" (OuterVolumeSpecName: "utilities") pod "3e500b14-effa-4301-88bf-9a1667024e99" (UID: "3e500b14-effa-4301-88bf-9a1667024e99"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.477800 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-krsxl" event={"ID":"aaf6da5c-03a6-4ca1-8202-a1acd91106d8","Type":"ContainerStarted","Data":"54ea98341a4003f148043b0c187b9ebb7690b797e62aaaf44a1f8834060d58e0"} Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.504600 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e500b14-effa-4301-88bf-9a1667024e99-kube-api-access-lcq8m" (OuterVolumeSpecName: "kube-api-access-lcq8m") pod "3e500b14-effa-4301-88bf-9a1667024e99" (UID: "3e500b14-effa-4301-88bf-9a1667024e99"). InnerVolumeSpecName "kube-api-access-lcq8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.510167 4912 scope.go:117] "RemoveContainer" containerID="90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.529858 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564078-krsxl" podStartSLOduration=2.451897963 podStartE2EDuration="3.52983221s" podCreationTimestamp="2026-03-18 14:38:00 +0000 UTC" firstStartedPulling="2026-03-18 14:38:01.062621669 +0000 UTC m=+5729.522049094" lastFinishedPulling="2026-03-18 14:38:02.140555916 +0000 UTC m=+5730.599983341" observedRunningTime="2026-03-18 14:38:03.519181412 +0000 UTC m=+5731.978608837" watchObservedRunningTime="2026-03-18 14:38:03.52983221 +0000 UTC m=+5731.989259635" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.570534 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e500b14-effa-4301-88bf-9a1667024e99" (UID: 
"3e500b14-effa-4301-88bf-9a1667024e99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.578829 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.578870 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcq8m\" (UniqueName: \"kubernetes.io/projected/3e500b14-effa-4301-88bf-9a1667024e99-kube-api-access-lcq8m\") on node \"crc\" DevicePath \"\"" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.578880 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e500b14-effa-4301-88bf-9a1667024e99-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.582608 4912 scope.go:117] "RemoveContainer" containerID="656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.614908 4912 scope.go:117] "RemoveContainer" containerID="38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753" Mar 18 14:38:03 crc kubenswrapper[4912]: E0318 14:38:03.615538 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753\": container with ID starting with 38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753 not found: ID does not exist" containerID="38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.615603 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753"} 
err="failed to get container status \"38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753\": rpc error: code = NotFound desc = could not find container \"38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753\": container with ID starting with 38ea8b89a6fd007dfe672ed87bcf1231cfcfc36304eda701c7324f4356a2a753 not found: ID does not exist" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.615628 4912 scope.go:117] "RemoveContainer" containerID="90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5" Mar 18 14:38:03 crc kubenswrapper[4912]: E0318 14:38:03.616399 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5\": container with ID starting with 90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5 not found: ID does not exist" containerID="90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.616527 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5"} err="failed to get container status \"90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5\": rpc error: code = NotFound desc = could not find container \"90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5\": container with ID starting with 90059bc78e04920d9ba090b0103cad15cceacadbdd6e510cbce935d909ac56d5 not found: ID does not exist" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.616594 4912 scope.go:117] "RemoveContainer" containerID="656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573" Mar 18 14:38:03 crc kubenswrapper[4912]: E0318 14:38:03.617147 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573\": container with ID starting with 656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573 not found: ID does not exist" containerID="656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.617576 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573"} err="failed to get container status \"656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573\": rpc error: code = NotFound desc = could not find container \"656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573\": container with ID starting with 656cfe0d7455f9c83b0942d22ab7a687cdf401d7245b2665e5cf055077c74573 not found: ID does not exist" Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.862387 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqhss"] Mar 18 14:38:03 crc kubenswrapper[4912]: I0318 14:38:03.875879 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sqhss"] Mar 18 14:38:04 crc kubenswrapper[4912]: I0318 14:38:04.249408 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e500b14-effa-4301-88bf-9a1667024e99" path="/var/lib/kubelet/pods/3e500b14-effa-4301-88bf-9a1667024e99/volumes" Mar 18 14:38:04 crc kubenswrapper[4912]: I0318 14:38:04.497017 4912 generic.go:334] "Generic (PLEG): container finished" podID="aaf6da5c-03a6-4ca1-8202-a1acd91106d8" containerID="54ea98341a4003f148043b0c187b9ebb7690b797e62aaaf44a1f8834060d58e0" exitCode=0 Mar 18 14:38:04 crc kubenswrapper[4912]: I0318 14:38:04.497175 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-krsxl" 
event={"ID":"aaf6da5c-03a6-4ca1-8202-a1acd91106d8","Type":"ContainerDied","Data":"54ea98341a4003f148043b0c187b9ebb7690b797e62aaaf44a1f8834060d58e0"} Mar 18 14:38:05 crc kubenswrapper[4912]: I0318 14:38:05.959739 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-krsxl" Mar 18 14:38:06 crc kubenswrapper[4912]: I0318 14:38:06.063015 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djw7g\" (UniqueName: \"kubernetes.io/projected/aaf6da5c-03a6-4ca1-8202-a1acd91106d8-kube-api-access-djw7g\") pod \"aaf6da5c-03a6-4ca1-8202-a1acd91106d8\" (UID: \"aaf6da5c-03a6-4ca1-8202-a1acd91106d8\") " Mar 18 14:38:06 crc kubenswrapper[4912]: I0318 14:38:06.072460 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf6da5c-03a6-4ca1-8202-a1acd91106d8-kube-api-access-djw7g" (OuterVolumeSpecName: "kube-api-access-djw7g") pod "aaf6da5c-03a6-4ca1-8202-a1acd91106d8" (UID: "aaf6da5c-03a6-4ca1-8202-a1acd91106d8"). InnerVolumeSpecName "kube-api-access-djw7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:38:06 crc kubenswrapper[4912]: I0318 14:38:06.168351 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djw7g\" (UniqueName: \"kubernetes.io/projected/aaf6da5c-03a6-4ca1-8202-a1acd91106d8-kube-api-access-djw7g\") on node \"crc\" DevicePath \"\"" Mar 18 14:38:06 crc kubenswrapper[4912]: I0318 14:38:06.537669 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-krsxl" event={"ID":"aaf6da5c-03a6-4ca1-8202-a1acd91106d8","Type":"ContainerDied","Data":"ff4158872134849d1bb3f234990697c821730a425af10922830eefa5e552e09d"} Mar 18 14:38:06 crc kubenswrapper[4912]: I0318 14:38:06.538659 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff4158872134849d1bb3f234990697c821730a425af10922830eefa5e552e09d" Mar 18 14:38:06 crc kubenswrapper[4912]: I0318 14:38:06.537921 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-krsxl" Mar 18 14:38:07 crc kubenswrapper[4912]: I0318 14:38:07.088391 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n94qm"] Mar 18 14:38:07 crc kubenswrapper[4912]: I0318 14:38:07.109590 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n94qm"] Mar 18 14:38:08 crc kubenswrapper[4912]: I0318 14:38:08.245058 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d84325-2fba-463a-9c88-48eb40e0e43e" path="/var/lib/kubelet/pods/20d84325-2fba-463a-9c88-48eb40e0e43e/volumes" Mar 18 14:38:08 crc kubenswrapper[4912]: I0318 14:38:08.582729 4912 generic.go:334] "Generic (PLEG): container finished" podID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerID="05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639" exitCode=0 Mar 18 14:38:08 crc kubenswrapper[4912]: I0318 14:38:08.582796 4912 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9pm" event={"ID":"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1","Type":"ContainerDied","Data":"05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639"} Mar 18 14:38:09 crc kubenswrapper[4912]: I0318 14:38:09.607553 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9pm" event={"ID":"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1","Type":"ContainerStarted","Data":"62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa"} Mar 18 14:38:09 crc kubenswrapper[4912]: I0318 14:38:09.635476 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wh9pm" podStartSLOduration=4.052998896 podStartE2EDuration="11.635445166s" podCreationTimestamp="2026-03-18 14:37:58 +0000 UTC" firstStartedPulling="2026-03-18 14:38:01.425785359 +0000 UTC m=+5729.885212784" lastFinishedPulling="2026-03-18 14:38:09.008231629 +0000 UTC m=+5737.467659054" observedRunningTime="2026-03-18 14:38:09.631809427 +0000 UTC m=+5738.091236872" watchObservedRunningTime="2026-03-18 14:38:09.635445166 +0000 UTC m=+5738.094872591" Mar 18 14:38:17 crc kubenswrapper[4912]: I0318 14:38:17.631203 4912 scope.go:117] "RemoveContainer" containerID="4f469a18c9d96bfb6f51ea966f431008006fdf353872583bae36a986335d18e2" Mar 18 14:38:19 crc kubenswrapper[4912]: I0318 14:38:19.246156 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:38:19 crc kubenswrapper[4912]: I0318 14:38:19.248989 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:38:20 crc kubenswrapper[4912]: I0318 14:38:20.314593 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wh9pm" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" 
containerName="registry-server" probeResult="failure" output=< Mar 18 14:38:20 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:38:20 crc kubenswrapper[4912]: > Mar 18 14:38:28 crc kubenswrapper[4912]: I0318 14:38:28.325541 4912 trace.go:236] Trace[398849788]: "Calculate volume metrics of iptables-alerter-script for pod openshift-network-operator/iptables-alerter-4ln5h" (18-Mar-2026 14:38:27.276) (total time: 1047ms): Mar 18 14:38:28 crc kubenswrapper[4912]: Trace[398849788]: [1.047844313s] [1.047844313s] END Mar 18 14:38:28 crc kubenswrapper[4912]: I0318 14:38:28.732718 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:38:28 crc kubenswrapper[4912]: I0318 14:38:28.735573 4912 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="77736799-2ebe-4076-9717-6741aed93599" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:38:30 crc kubenswrapper[4912]: I0318 14:38:30.311180 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wh9pm" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="registry-server" probeResult="failure" output=< Mar 18 14:38:30 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:38:30 crc kubenswrapper[4912]: > Mar 18 14:38:40 crc kubenswrapper[4912]: I0318 14:38:40.308854 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wh9pm" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="registry-server" probeResult="failure" output=< Mar 18 14:38:40 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:38:40 crc kubenswrapper[4912]: > Mar 18 14:38:50 crc kubenswrapper[4912]: I0318 
14:38:50.311033 4912 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wh9pm" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="registry-server" probeResult="failure" output=< Mar 18 14:38:50 crc kubenswrapper[4912]: timeout: failed to connect service ":50051" within 1s Mar 18 14:38:50 crc kubenswrapper[4912]: > Mar 18 14:38:59 crc kubenswrapper[4912]: I0318 14:38:59.313609 4912 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:38:59 crc kubenswrapper[4912]: I0318 14:38:59.384299 4912 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:39:00 crc kubenswrapper[4912]: I0318 14:39:00.144105 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wh9pm"] Mar 18 14:39:00 crc kubenswrapper[4912]: I0318 14:39:00.368867 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wh9pm" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="registry-server" containerID="cri-o://62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa" gracePeriod=2 Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.043983 4912 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.090025 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-catalog-content\") pod \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.090555 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5bvd\" (UniqueName: \"kubernetes.io/projected/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-kube-api-access-w5bvd\") pod \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.091474 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-utilities\") pod \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\" (UID: \"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1\") " Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.092213 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-utilities" (OuterVolumeSpecName: "utilities") pod "3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" (UID: "3d9ffe4a-3554-4f64-9ba4-62996cb5cca1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.110644 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-kube-api-access-w5bvd" (OuterVolumeSpecName: "kube-api-access-w5bvd") pod "3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" (UID: "3d9ffe4a-3554-4f64-9ba4-62996cb5cca1"). InnerVolumeSpecName "kube-api-access-w5bvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.201624 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5bvd\" (UniqueName: \"kubernetes.io/projected/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-kube-api-access-w5bvd\") on node \"crc\" DevicePath \"\"" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.202028 4912 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.300933 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" (UID: "3d9ffe4a-3554-4f64-9ba4-62996cb5cca1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.314748 4912 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.386833 4912 generic.go:334] "Generic (PLEG): container finished" podID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerID="62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa" exitCode=0 Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.386910 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9pm" event={"ID":"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1","Type":"ContainerDied","Data":"62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa"} Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.386954 4912 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wh9pm" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.387015 4912 scope.go:117] "RemoveContainer" containerID="62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.386970 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9pm" event={"ID":"3d9ffe4a-3554-4f64-9ba4-62996cb5cca1","Type":"ContainerDied","Data":"841cbd55995636279fa8dced4e99cd59831478d7a6bab21ee24533f6ff83e555"} Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.420284 4912 scope.go:117] "RemoveContainer" containerID="05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.435191 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wh9pm"] Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.459539 4912 scope.go:117] "RemoveContainer" containerID="2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.478825 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wh9pm"] Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.523153 4912 scope.go:117] "RemoveContainer" containerID="62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa" Mar 18 14:39:01 crc kubenswrapper[4912]: E0318 14:39:01.523652 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa\": container with ID starting with 62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa not found: ID does not exist" containerID="62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.523687 4912 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa"} err="failed to get container status \"62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa\": rpc error: code = NotFound desc = could not find container \"62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa\": container with ID starting with 62ad9689125ff6d0e12853d24378d1846dcc98001951c244a38eaaf3eb4ababa not found: ID does not exist" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.523716 4912 scope.go:117] "RemoveContainer" containerID="05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639" Mar 18 14:39:01 crc kubenswrapper[4912]: E0318 14:39:01.524686 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639\": container with ID starting with 05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639 not found: ID does not exist" containerID="05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.524711 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639"} err="failed to get container status \"05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639\": rpc error: code = NotFound desc = could not find container \"05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639\": container with ID starting with 05bd9cf7cf0fa0f8c96cee37e1f3bb82224a68dfe233aa0a862f896c7b0b6639 not found: ID does not exist" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.524727 4912 scope.go:117] "RemoveContainer" containerID="2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0" Mar 18 14:39:01 crc kubenswrapper[4912]: E0318 
14:39:01.524935 4912 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0\": container with ID starting with 2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0 not found: ID does not exist" containerID="2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0" Mar 18 14:39:01 crc kubenswrapper[4912]: I0318 14:39:01.524957 4912 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0"} err="failed to get container status \"2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0\": rpc error: code = NotFound desc = could not find container \"2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0\": container with ID starting with 2862852a29b3850107b94502fabd747f453579f83d73b56a2bdc68b6e4b1a7e0 not found: ID does not exist" Mar 18 14:39:02 crc kubenswrapper[4912]: I0318 14:39:02.243723 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" path="/var/lib/kubelet/pods/3d9ffe4a-3554-4f64-9ba4-62996cb5cca1/volumes" Mar 18 14:39:10 crc kubenswrapper[4912]: I0318 14:39:10.995972 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6f59b977c9-rwwx4" podUID="08a4effe-9a7e-449c-aba4-74d4b7a4f0ae" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 18 14:39:37 crc kubenswrapper[4912]: I0318 14:39:36.999353 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:39:37 crc kubenswrapper[4912]: I0318 14:39:37.000459 4912 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.166891 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564080-p2pb4"] Mar 18 14:40:00 crc kubenswrapper[4912]: E0318 14:40:00.168806 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.168832 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4912]: E0318 14:40:00.168863 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="extract-utilities" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.168875 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="extract-utilities" Mar 18 14:40:00 crc kubenswrapper[4912]: E0318 14:40:00.168902 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="extract-utilities" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.168912 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="extract-utilities" Mar 18 14:40:00 crc kubenswrapper[4912]: E0318 14:40:00.168938 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.168947 4912 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4912]: E0318 14:40:00.168995 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf6da5c-03a6-4ca1-8202-a1acd91106d8" containerName="oc" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.169007 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf6da5c-03a6-4ca1-8202-a1acd91106d8" containerName="oc" Mar 18 14:40:00 crc kubenswrapper[4912]: E0318 14:40:00.169027 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="extract-content" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.169062 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="extract-content" Mar 18 14:40:00 crc kubenswrapper[4912]: E0318 14:40:00.169097 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="extract-content" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.169108 4912 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="extract-content" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.169432 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9ffe4a-3554-4f64-9ba4-62996cb5cca1" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.169471 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf6da5c-03a6-4ca1-8202-a1acd91106d8" containerName="oc" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.169489 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e500b14-effa-4301-88bf-9a1667024e99" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.170810 4912 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.173402 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.174225 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.174238 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.197405 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-p2pb4"] Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.251347 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5mjw\" (UniqueName: \"kubernetes.io/projected/00a304b3-bf52-40bb-9ede-39e8d7333d61-kube-api-access-l5mjw\") pod \"auto-csr-approver-29564080-p2pb4\" (UID: \"00a304b3-bf52-40bb-9ede-39e8d7333d61\") " pod="openshift-infra/auto-csr-approver-29564080-p2pb4" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.355373 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5mjw\" (UniqueName: \"kubernetes.io/projected/00a304b3-bf52-40bb-9ede-39e8d7333d61-kube-api-access-l5mjw\") pod \"auto-csr-approver-29564080-p2pb4\" (UID: \"00a304b3-bf52-40bb-9ede-39e8d7333d61\") " pod="openshift-infra/auto-csr-approver-29564080-p2pb4" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.378545 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5mjw\" (UniqueName: \"kubernetes.io/projected/00a304b3-bf52-40bb-9ede-39e8d7333d61-kube-api-access-l5mjw\") pod \"auto-csr-approver-29564080-p2pb4\" (UID: \"00a304b3-bf52-40bb-9ede-39e8d7333d61\") " 
pod="openshift-infra/auto-csr-approver-29564080-p2pb4" Mar 18 14:40:00 crc kubenswrapper[4912]: I0318 14:40:00.506167 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" Mar 18 14:40:01 crc kubenswrapper[4912]: I0318 14:40:01.104583 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-p2pb4"] Mar 18 14:40:01 crc kubenswrapper[4912]: I0318 14:40:01.124084 4912 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:40:01 crc kubenswrapper[4912]: I0318 14:40:01.214892 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" event={"ID":"00a304b3-bf52-40bb-9ede-39e8d7333d61","Type":"ContainerStarted","Data":"b1e5e962f6470e087f02231857ebf7573d9289c28da0eeaecb4768a585781e0d"} Mar 18 14:40:03 crc kubenswrapper[4912]: I0318 14:40:03.258562 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" event={"ID":"00a304b3-bf52-40bb-9ede-39e8d7333d61","Type":"ContainerStarted","Data":"09470d7f885862308d8074781250385f8bc170da39c21ee1e6f5dc66d23e3aa2"} Mar 18 14:40:03 crc kubenswrapper[4912]: I0318 14:40:03.312367 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" podStartSLOduration=2.247890896 podStartE2EDuration="3.312338509s" podCreationTimestamp="2026-03-18 14:40:00 +0000 UTC" firstStartedPulling="2026-03-18 14:40:01.103326095 +0000 UTC m=+5849.562753520" lastFinishedPulling="2026-03-18 14:40:02.167773708 +0000 UTC m=+5850.627201133" observedRunningTime="2026-03-18 14:40:03.284544527 +0000 UTC m=+5851.743971972" watchObservedRunningTime="2026-03-18 14:40:03.312338509 +0000 UTC m=+5851.771765934" Mar 18 14:40:04 crc kubenswrapper[4912]: I0318 14:40:04.277561 4912 generic.go:334] "Generic (PLEG): container finished" 
podID="00a304b3-bf52-40bb-9ede-39e8d7333d61" containerID="09470d7f885862308d8074781250385f8bc170da39c21ee1e6f5dc66d23e3aa2" exitCode=0 Mar 18 14:40:04 crc kubenswrapper[4912]: I0318 14:40:04.277765 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" event={"ID":"00a304b3-bf52-40bb-9ede-39e8d7333d61","Type":"ContainerDied","Data":"09470d7f885862308d8074781250385f8bc170da39c21ee1e6f5dc66d23e3aa2"} Mar 18 14:40:05 crc kubenswrapper[4912]: I0318 14:40:05.794719 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" Mar 18 14:40:05 crc kubenswrapper[4912]: I0318 14:40:05.876767 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5mjw\" (UniqueName: \"kubernetes.io/projected/00a304b3-bf52-40bb-9ede-39e8d7333d61-kube-api-access-l5mjw\") pod \"00a304b3-bf52-40bb-9ede-39e8d7333d61\" (UID: \"00a304b3-bf52-40bb-9ede-39e8d7333d61\") " Mar 18 14:40:05 crc kubenswrapper[4912]: I0318 14:40:05.885687 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a304b3-bf52-40bb-9ede-39e8d7333d61-kube-api-access-l5mjw" (OuterVolumeSpecName: "kube-api-access-l5mjw") pod "00a304b3-bf52-40bb-9ede-39e8d7333d61" (UID: "00a304b3-bf52-40bb-9ede-39e8d7333d61"). InnerVolumeSpecName "kube-api-access-l5mjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:40:05 crc kubenswrapper[4912]: I0318 14:40:05.982333 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5mjw\" (UniqueName: \"kubernetes.io/projected/00a304b3-bf52-40bb-9ede-39e8d7333d61-kube-api-access-l5mjw\") on node \"crc\" DevicePath \"\"" Mar 18 14:40:06 crc kubenswrapper[4912]: I0318 14:40:06.316781 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" event={"ID":"00a304b3-bf52-40bb-9ede-39e8d7333d61","Type":"ContainerDied","Data":"b1e5e962f6470e087f02231857ebf7573d9289c28da0eeaecb4768a585781e0d"} Mar 18 14:40:06 crc kubenswrapper[4912]: I0318 14:40:06.317326 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e5e962f6470e087f02231857ebf7573d9289c28da0eeaecb4768a585781e0d" Mar 18 14:40:06 crc kubenswrapper[4912]: I0318 14:40:06.316899 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-p2pb4" Mar 18 14:40:06 crc kubenswrapper[4912]: I0318 14:40:06.396343 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-97bwz"] Mar 18 14:40:06 crc kubenswrapper[4912]: I0318 14:40:06.415670 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-97bwz"] Mar 18 14:40:06 crc kubenswrapper[4912]: I0318 14:40:06.999348 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:40:06 crc kubenswrapper[4912]: I0318 14:40:06.999417 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" 
podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:40:08 crc kubenswrapper[4912]: I0318 14:40:08.248538 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6268ac-bc25-4cf3-a635-9aa2412b10aa" path="/var/lib/kubelet/pods/8f6268ac-bc25-4cf3-a635-9aa2412b10aa/volumes" Mar 18 14:40:17 crc kubenswrapper[4912]: I0318 14:40:17.904137 4912 scope.go:117] "RemoveContainer" containerID="30e82b8ced5effa764d511bff534922a40bcd0d146ac7e59699a64e9e6020494" Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:36.999510 4912 patch_prober.go:28] interesting pod/machine-config-daemon-vsp6g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.000384 4912 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.000472 4912 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.002075 4912 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e67315cf5ba5038c60598c4e6269b22af75c078721ab59e1bf37b409fd8cfe64"} pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.002294 4912 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" podUID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerName="machine-config-daemon" containerID="cri-o://e67315cf5ba5038c60598c4e6269b22af75c078721ab59e1bf37b409fd8cfe64" gracePeriod=600 Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.782639 4912 generic.go:334] "Generic (PLEG): container finished" podID="c0c45cd5-793c-419f-8fe6-a2239050972e" containerID="e67315cf5ba5038c60598c4e6269b22af75c078721ab59e1bf37b409fd8cfe64" exitCode=0 Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.782892 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerDied","Data":"e67315cf5ba5038c60598c4e6269b22af75c078721ab59e1bf37b409fd8cfe64"} Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.783192 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vsp6g" event={"ID":"c0c45cd5-793c-419f-8fe6-a2239050972e","Type":"ContainerStarted","Data":"2f72dfee533741deff1489a869fb1ba2c96d2aafc3d43dbc685d22503ca97bfe"} Mar 18 14:40:37 crc kubenswrapper[4912]: I0318 14:40:37.783222 4912 scope.go:117] "RemoveContainer" containerID="21dbeb64542fe95e57b2591d2ca5538d9cec241e64defdee52491eddfc779589" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.157605 4912 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564082-nqm89"] Mar 18 14:42:00 crc kubenswrapper[4912]: E0318 14:42:00.159631 4912 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a304b3-bf52-40bb-9ede-39e8d7333d61" containerName="oc" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.159654 4912 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="00a304b3-bf52-40bb-9ede-39e8d7333d61" containerName="oc" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.160022 4912 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a304b3-bf52-40bb-9ede-39e8d7333d61" containerName="oc" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.161421 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-nqm89" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.168540 4912 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k2ghg" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.169141 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.171280 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-nqm89"] Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.171603 4912 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.274898 4912 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqc9\" (UniqueName: \"kubernetes.io/projected/6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994-kube-api-access-ckqc9\") pod \"auto-csr-approver-29564082-nqm89\" (UID: \"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994\") " pod="openshift-infra/auto-csr-approver-29564082-nqm89" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.378263 4912 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqc9\" (UniqueName: \"kubernetes.io/projected/6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994-kube-api-access-ckqc9\") pod \"auto-csr-approver-29564082-nqm89\" (UID: \"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994\") " 
pod="openshift-infra/auto-csr-approver-29564082-nqm89" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.400999 4912 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqc9\" (UniqueName: \"kubernetes.io/projected/6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994-kube-api-access-ckqc9\") pod \"auto-csr-approver-29564082-nqm89\" (UID: \"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994\") " pod="openshift-infra/auto-csr-approver-29564082-nqm89" Mar 18 14:42:00 crc kubenswrapper[4912]: I0318 14:42:00.491424 4912 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-nqm89" Mar 18 14:42:01 crc kubenswrapper[4912]: I0318 14:42:01.111299 4912 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-nqm89"] Mar 18 14:42:02 crc kubenswrapper[4912]: I0318 14:42:02.093424 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-nqm89" event={"ID":"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994","Type":"ContainerStarted","Data":"651d842d2f963da41e8b01d30eed68b413ecaee34edecb595ab3e6d41d9320b8"} Mar 18 14:42:03 crc kubenswrapper[4912]: I0318 14:42:03.127937 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-nqm89" event={"ID":"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994","Type":"ContainerStarted","Data":"0884413b2b6cfdefc5a6318c4f477140522f79cf69fed7e30f62c4a3ce2ded8b"} Mar 18 14:42:03 crc kubenswrapper[4912]: I0318 14:42:03.166356 4912 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564082-nqm89" podStartSLOduration=1.908181898 podStartE2EDuration="3.166327252s" podCreationTimestamp="2026-03-18 14:42:00 +0000 UTC" firstStartedPulling="2026-03-18 14:42:01.111458642 +0000 UTC m=+5969.570886077" lastFinishedPulling="2026-03-18 14:42:02.369604006 +0000 UTC m=+5970.829031431" observedRunningTime="2026-03-18 
14:42:03.15922494 +0000 UTC m=+5971.618652385" watchObservedRunningTime="2026-03-18 14:42:03.166327252 +0000 UTC m=+5971.625754687" Mar 18 14:42:04 crc kubenswrapper[4912]: I0318 14:42:04.143758 4912 generic.go:334] "Generic (PLEG): container finished" podID="6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994" containerID="0884413b2b6cfdefc5a6318c4f477140522f79cf69fed7e30f62c4a3ce2ded8b" exitCode=0 Mar 18 14:42:04 crc kubenswrapper[4912]: I0318 14:42:04.143806 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-nqm89" event={"ID":"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994","Type":"ContainerDied","Data":"0884413b2b6cfdefc5a6318c4f477140522f79cf69fed7e30f62c4a3ce2ded8b"} Mar 18 14:42:05 crc kubenswrapper[4912]: I0318 14:42:05.685346 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-nqm89" Mar 18 14:42:05 crc kubenswrapper[4912]: I0318 14:42:05.742015 4912 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqc9\" (UniqueName: \"kubernetes.io/projected/6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994-kube-api-access-ckqc9\") pod \"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994\" (UID: \"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994\") " Mar 18 14:42:05 crc kubenswrapper[4912]: I0318 14:42:05.754963 4912 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994-kube-api-access-ckqc9" (OuterVolumeSpecName: "kube-api-access-ckqc9") pod "6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994" (UID: "6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994"). InnerVolumeSpecName "kube-api-access-ckqc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:42:05 crc kubenswrapper[4912]: I0318 14:42:05.847861 4912 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckqc9\" (UniqueName: \"kubernetes.io/projected/6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994-kube-api-access-ckqc9\") on node \"crc\" DevicePath \"\"" Mar 18 14:42:06 crc kubenswrapper[4912]: I0318 14:42:06.177316 4912 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-nqm89" event={"ID":"6eca6e8d-6e9c-4dad-8e2a-e03fbb5af994","Type":"ContainerDied","Data":"651d842d2f963da41e8b01d30eed68b413ecaee34edecb595ab3e6d41d9320b8"} Mar 18 14:42:06 crc kubenswrapper[4912]: I0318 14:42:06.177363 4912 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651d842d2f963da41e8b01d30eed68b413ecaee34edecb595ab3e6d41d9320b8" Mar 18 14:42:06 crc kubenswrapper[4912]: I0318 14:42:06.177419 4912 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-nqm89" Mar 18 14:42:06 crc kubenswrapper[4912]: I0318 14:42:06.770567 4912 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-vcw5x"] Mar 18 14:42:06 crc kubenswrapper[4912]: I0318 14:42:06.782246 4912 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-vcw5x"] Mar 18 14:42:08 crc kubenswrapper[4912]: I0318 14:42:08.244330 4912 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c215a29-03d9-4b1c-9057-afae168546e1" path="/var/lib/kubelet/pods/4c215a29-03d9-4b1c-9057-afae168546e1/volumes" Mar 18 14:42:18 crc kubenswrapper[4912]: I0318 14:42:18.093588 4912 scope.go:117] "RemoveContainer" containerID="42f881605576ba27a2c6bf238be35b82e2e6e28a5bf940c80b1fdc666bf25f39"